BLOG
@ClaudiaGIS Hi, yes, sure, please email me at bharold at esri dot com - thanks.
Posted 04-27-2022 02:32 PM

IDEA
@Anonymous User Can you comment? For Data Interoperability users, regarding Phil's question - yes, we can 'see' Enterprise and Online feature services. On the wider topic of no-code integration options like Power Automate, Make, etc., I'm working on a Data Interoperability sample where continuous integration is implemented and verified with a combination of scheduled feature service refreshes and triggered web tool actions in Enterprise, with the result validated from the webhook change set.
Posted 04-27-2022 09:06 AM

POST
That looks like the file didn't download properly; we should have a better error message that verifies a checksum. Try downloading again, and if it still fails, open a support call.
Posted 04-08-2022 06:40 AM

IDEA
I find that SQL statements for this use case can blow up on you when row counts reach the hundreds of thousands, for example when leveraging the output of Generate Near Table.
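One way around that, sketched minimally in Python (the field name, ID values and chunk size here are hypothetical), is to batch the IDs into several smaller IN clauses rather than one giant statement:

def chunked_where_clauses(oid_field, oids, size=50000):
    # Yield IN-clause predicates over manageable chunks of IDs so no single
    # SQL statement has to carry hundreds of thousands of values.
    for i in range(0, len(oids), size):
        chunk = oids[i:i + size]
        yield f'{oid_field} IN ({",".join(str(oid) for oid in chunk)})'

# Example: IDs as they might come from Generate Near Table output
near_oids = list(range(1, 250001))
for where_clause in chunked_where_clauses('OBJECTID', near_oids):
    ...  # run one selection or cursor per chunk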
Posted 04-04-2022 08:53 AM

POST
Hi Brittany, your feature service will have datetime fields stored in UTC, so when upserting with Data Interoperability, use the available datetime transformation functions to offset your local values to UTC before writing.
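If you want to sanity-check the offset outside the ETL tool, here is a minimal Python sketch (the timestamp and time zone are placeholders, not from your data; zoneinfo needs Python 3.9+):

from datetime import datetime
from zoneinfo import ZoneInfo

# Hypothetical local timestamp as it might appear in source data
local_dt = datetime(2022, 3, 31, 10, 51, tzinfo=ZoneInfo('America/Los_Angeles'))

# Convert to UTC before writing to the feature service
utc_dt = local_dt.astimezone(ZoneInfo('UTC'))
print(utc_dt.isoformat())  # 2022-03-31T17:51:00+00:00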
Posted 03-31-2022 10:51 AM

BLOG
Public safety geocoders in particular like to give cross-street names to first responders navigating to street addresses; their local knowledge helps them decide which streets 'book end' the block they are heading for. Building a locator that does this starts with getting cross street data into the street centerline feature class you are using as locator reference data. Here are some streets in Redding, California, where I have derived cross street fields for a feature class; the pop-up relates to the selected street segment and I have highlighted the added cross street fields:

Inspecting Cross Streets Made By The Tool

ArcGIS Data Interoperability comes to the rescue here, especially the TopologyBuilder transformer. In the past I have done all sorts of math in Python to calculate data structures I can query for cross streets, but I much prefer the no-code approach. If you go way back in ArcGIS, like some people I know, you may recall that arc-node topology was baked into the storage model for polygonal features. This is no longer the case in a geodatabase, nor is it easy to make - except with ArcGIS Data Interoperability. So that is what I used in the ETL tool in the post download (requires ArcGIS Pro 3.3 and the ArcGIS Data Interoperability extension):

ETL tool that makes cross streets

I'll let you surf the tool yourselves, but in outline: I read in street reference data, send the geometry, ObjectID and a full street name attribute into some processing that figures out cross streets, then join these back onto the street features and write out a new feature class. I don't include the data, but it is publicly available, so you can test drive the tool for yourself after repairing paths. In the download toolbox there is also a model that creates a locator (also included) with custom output fields for cross streets; here it is in action with a 'What's Here?' query in a map. You can get to the area with the supplied locator by geocoding 2500 CELESTIAL ST, Redding, 96002.

Cross Streets Returned By The Locator

In production, street centerlines would often be used as one primary table in a locator that also uses point address data and zone features, and to access the cross street values from a geocode service you might have to use a category filter of 'Street Address'. If you need help putting this together, please contact your local Esri representative or use the comments feature in this post - I'll be listening!
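For the curious, here is a minimal Python sketch of the arc-node idea behind the tool, with hypothetical sample segments; TopologyBuilder does this (and much more) without code. Index the nodes where segments meet, then the cross streets for a segment are the other street names at its end nodes:

from collections import defaultdict

# Each street segment: (name, start node, end node); nodes are snapped coordinates
segments = [
    ('MAIN ST',      (0, 0), (1, 0)),
    ('CELESTIAL ST', (1, 0), (2, 0)),
    ('FIRST AVE',    (1, 0), (1, 1)),
    ('SECOND AVE',   (2, 0), (2, 1)),
]

# Build arc-node topology: which street names meet at each node
names_at_node = defaultdict(set)
for name, start, end in segments:
    names_at_node[start].add(name)
    names_at_node[end].add(name)

# Report cross streets at each segment's end nodes
for name, start, end in segments:
    from_cross = sorted(names_at_node[start] - {name})
    to_cross = sorted(names_at_node[end] - {name})
    print(f'{name}: from {from_cross} to {to_cross}')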
Posted 03-23-2022 11:32 AM

IDEA
Hi, FlatGeobuf is in the GDAL drivers we ship in Pro:

from osgeo import gdal, ogr

# Collect the unique vector driver names registered with OGR
vformats = []
for i in range(ogr.GetDriverCount()):
    name = ogr.GetDriver(i).GetName()
    if name not in vformats:
        vformats.append(name)
vformats = sorted(vformats)

# Collect the unique raster driver long names registered with GDAL
rformats = []
for i in range(gdal.GetDriverCount()):
    name = gdal.GetDriver(i).LongName
    if name not in rformats:
        rformats.append(name)
rformats = sorted(rformats)

print('Vector Formats:\n')
for fmt in vformats:
    print(fmt)
print('\nRaster Formats:\n')
for fmt in rformats:
    print(fmt)

Not super easy, but not too hard either, to write a converter; let us know if you need help. The Data Interoperability extension will support FlatGeobuf at Pro 3.2.
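If you want a head start on that converter, here is a hedged sketch using the GDAL Python bindings shipped with Pro (assuming the build includes the FlatGeobuf driver, GDAL 3.1+; the paths are hypothetical):

from osgeo import gdal

gdal.UseExceptions()  # raise Python exceptions instead of returning None

# Convert any OGR-readable vector dataset to FlatGeobuf in one call
gdal.VectorTranslate('roads.fgb', 'roads.shp', format='FlatGeobuf')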
Posted 03-21-2022 06:40 AM

IDEA
Like this: https://pm.maps.arcgis.com/home/item.html?id=e638afe0695a4ad38388cb8d9b350446 Selections are propagated based on shared values of two fields, one in each layer/view.
Posted 03-17-2022 09:40 AM

IDEA
This is the old Keyfile select capability from Arcplot; it is very much needed in Pro. I have emulated it with a script tool, but it's not ideal.
Posted 03-17-2022 09:27 AM

BLOG
This article is another spinoff from the recent (at writing) 2022 Esri Partner and Developer conferences in Palm Springs, California, where an attendee asked if we could see where his Android phone goes. Well, we can't, but the phone owner sure can, and as usual with ArcGIS Data Interoperability, it's no-code easy! If you enable location history on your Android device, you can download periodic snapshots of the location data, along with many other variables, from Google's Takeout site. There are a lot of things you can package up in any download; if you're only interested in location data, deselect all options, then scroll about two thirds of the way through the options to find Location History and select it:

Location History

While setting up your export there are options to send the data to a cloud store; these are accessible to Data Interoperability too:

Export Options

When you get some data you'll see it comes in month chunks as JSON documents, for example 2022_MARCH.json. There are a number of object types in the data: places visited; activities that describe movement (like driving and walking, but many more possibilities); places parked at; and child places, where visiting a place was judged to result in secondary place visits - like shopping. Everything has coordinates and UTC time stamps, and places have street addresses and business names. The data is probabilistic, meaning Google's guess at how you moved and where you visited is given a rank amongst a number of possibilities. You'll see that in the tools in the post download I only preserved the highest ranking option. I rate the accuracy as very high.

So how did I ingest this data and what does it look like? I created a custom format for the JSON (details below), and some raw data looks like this when it lands in a geodatabase:

Raw Location History

The linework isn't very inspiring at first glance (but see below); while start and end points are usually accurate, waypoints are only captured at sparse intervals, something like every 20 minutes (don't quote me on that, I haven't done a rigorous analysis). At small scales the 'routes' are way off:

Activity Extent

However, we can uplift the data by replacing the routes with ones solved by the ArcGIS Online routing service. Here is how the data looks after that treatment:

Uplifted Routes

Much better! The routes are not perfectly accurate, but they do use the start, waypoint and stop coordinates, plus traffic conditions if the data isn't too old. If we go back to the full extent we can see some activities in mid extent (shopping?) and in Palm Springs (working?).

Destination Activities

First the supposed shopping:

Desert Hills at Cabazon

I can recommend the date shake at the place famous for them :-). How about the final destination?

Palm Springs Convention Centre

The Esri conference! I will not pick through the tools in detail, but in the download you will find:

GoogleLocationHistory.fmw - Workspace to create the custom format
RouteActivitySegments.fmw - Workspace to route the custom format
GOOGLE_LOCATION_HISTORY_JSON.fds - Custom format definition

Copy the fds file into your default custom format folder (you can set up other shares in FME options too). That is a path like this: C:\Users\<username>\Documents\FME\Formats. If you experience a crash of Pro when using the custom format, you have hit a known bug which will be fixed in Pro 3.0; in the meantime, just reboot your machine and the crash will not reoccur. I used Data Interoperability for Pro 2.9, so you'll need that release or later.
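If you want to preview the JSON before building anything, here is a minimal Python sketch that pulls the place visits out of one month file; the file name and field names reflect a sample Takeout export and are not guaranteed stable:

import json

# Read one month of Semantic Location History from a Takeout export
with open('2022_MARCH.json', encoding='utf-8') as f:
    timeline = json.load(f).get('timelineObjects', [])

for obj in timeline:
    visit = obj.get('placeVisit')
    if not visit:
        continue  # this sketch skips activitySegment records
    loc = visit.get('location', {})
    # Coordinates are stored as integers scaled by 1e7
    lat = loc.get('latitudeE7', 0) / 1e7
    lon = loc.get('longitudeE7', 0) / 1e7
    print(loc.get('name'), '|', loc.get('address'), '|', lat, lon)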
The routing functionality requires an ArcGIS Online account with routing permissions and will consume credits. I used the default travel modes for my organization; if you want to tune yours, see this help topic. If you are new to Data Interoperability there is an Easy Button: install and license the extension, which delivers a toolset named Data Interoperability Tools containing a tool called Quick Import. Open that tool, search for the input custom format Google Location History JSON, choose a destination file geodatabase name, and run the tool - it will deliver the data (unrouted) into the new file geodatabase. If you want the data routed, you can edit and run RouteActivitySegments.fmw with the Workbench app, or create a new ETL tool in a toolbox with RouteActivitySegments.fmw as its source and run it interactively. Now you are all set - go get that location history into your GIS!
Posted 03-17-2022 07:17 AM

POST
Kathy, just to clear one thing up: if you did have Data Interoperability on your server, it is not necessary for client machines to also have it licensed or installed to call a web tool that uses it on the server. I can picture one architecture that might work in the absence of a web tool, namely a high-frequency scheduled task on the VM that checks for and processes inputs it can access.
Posted 03-07-2022 07:15 AM

POST
Thanks Renato, a related topic: https://community.esri.com/t5/arcgis-data-interoperability-blog/building-a-data-driven-organization-part-2-go/ba-p/1081060
Posted 03-07-2022 06:31 AM

BLOG
Very often, making information products means implementing a feature classification system using the business rules of a user community. Examples abound: administrative divisions of land, classes of asset, agricultural production, forms of land cover, quality metrics, ownership hierarchies, and the example I'm running with in this post - NAICS industry classifications. There are more than a thousand industry classes at the most granular level, so you will appreciate that querying them can be complex and error prone. This post is about you, as a data curator, making complex queries simple for your colleagues by using ArcGIS Data Interoperability. With NAICS data, queries can look like this, needing just a few clicks:

Selecting Data Classes

And not like this, scrolling through 1000+ rows and guessing search terms:

Industry Classes by SQL Query

Best of all, you don't have to write a line of code. As a bonus extra, I'm going to include some tips on precision geocoding of this industry data. Let's get going!

My subject industry data is from an Open Data site for the City of Seattle, specifically their Active Business License Tax Certificate dataset. Seattle uses Socrata technology, which is directly readable by Data Interoperability once you download and install the relevant Hub package. Socrata is a popular choice for open data sites, typically used for purely tabular data, as is the case here. My hypothetical use case: I run a business specializing in pest control, and I want to map potential customers in the food business for target marketing. I'm going to build a Spatial ETL tool that lets me use smart pickers for this complex data and appropriate geocoding options to make the data spatial; then I can share the tool with anyone interested in the same business data, but perhaps for different industry sectors.

The North American Industry Classification System is used in the United States, Canada and Mexico and is broadly compatible with the UN's International Standard Industrial Classification system; there are analogs worldwide. The business license dataset is not spatial, but it is very well maintained and contains good address data. I'm going to geocode it with tight quality control; I'll do this by confining geocode matches to the high precision point of interest (POI, meaning matches include the business name) categories available in the ArcGIS Online World Geocoding Service - the ones that match my industry classes of interest. Successful POI geocodes validate business names and can return things like website and phone number details. I will also allow the high-precision cases of normal address geocoding: if a POI match cannot be found, we use the street address. No matching to any sort of zone or area will be allowed. In short, I will geocode to match the industry classifications of interest, with a fallback to precise types of street address. Here is the tree-pick experience for geocode category; the categories themselves are built into the geocode service, but the picker is enabled by Data Interoperability, just like the industry class one:

Geocode Categories

There are several categories I picked that are out of sight in the screen capture above, but navigating to them was very easy given the tree structure and hierarchy of available choices. So I can easily choose my Socrata-hosted industries of interest to extract that data, and I can pick how I want the data geocoded by category. Is there anything else I need to control?
Yes: when inspecting the data I could see a few food business licensees located across the border in Canada, so to support specifying two allowed countries I built a picker for the country code parameter:

Country Code

Country code is a hard filter; no matches can stray into another country, no matter how much an address resembles one in another country. For completeness I also built a picker for the language code parameter, which you can read about at the same link as the one above for country code:

Language Code

For my data, specifying language code wasn't useful, but if you need to work multilingually you can. Now let's get to the real value proposition for this post: showing you how to build smart ETL tool parameters. The 'trick' is using advanced options when building Choice parameters. In my tool that extracts and geocodes business data I have a number of user parameters. To make each tree picker I used the Import option (to the right in the Choice Configuration area) to read a CSV file supplying the tree definition. The CSV file has the Display and Value columns the importer is looking for:

Choice Tree Parameter Configuration

The CSV files are in the post download; the one for industry classes looks like this:

Industry Parameter Import CSV

The only remarkable detail for this file is that I used a pipe delimiter, not a comma, as the display values contain commas. How did I make the CSV files? They are all built with one Data Interoperability workspace, MakeParameterCSVs.fmw, in the post download. Start Workbench from the Analysis ribbon in Pro, open the file, and we can walk through it.

MakeParameterCSVs

To make Categories.csv and Countries.csv for geocoding parameter purposes, the geocode service definition is queried and the JSON response picked apart; this is a simple example of handling JSON from the web. LangCodes.csv is made by querying the web help. In the post download is a Python script, GeocodeServerProperties.py, that will give you an idea of all available properties in the service. As a side note, if you are geocoding against an on-premise StreetMap geocode service, the same properties are available. To make NAICSCodes.csv, the appropriate NAICS spreadsheet download URL is read by an Excel reader and the desired hierarchy built. The Socrata dataset is also read, to make sure only industry codes that exist in Seattle will be available in the destination tool parameter. With the desired CSV files in place, my ETL tool - BusinessExtractor - can be configured. Let's walk through that:

BusinessExtractor Workspace

Basically, Socrata is read and geocoded with the findAddressCandidates REST endpoint. If the data was bigger in row count I would have used geocodeAddresses. I use 4 concurrent threads in the HTTPCaller, and performance was excellent. Note that you will need to edit the ArcGISOnlineTokenGetter to use your credentials, and you will need the geocoding privilege in Online. Seasoned users will know there is a Geocoder transformer I could have used, but it doesn't support category filtering or language code. The Geocoder transformer is scheduled for a rewrite, so stay tuned for a post on that! Geocoding parameters are taken from the input pickers. The most challenging part was parsing the category parameter to make it correctly formed for the REST API; a minimal sketch of that parsing follows, then the actual before and after values.
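Here is a hedged Python sketch of that parsing step (the tool itself does this with Data Interoperability transformers; the sample value is truncated for brevity): split the quoted multiword names respecting the quotes, then rejoin with commas.

import shlex

# Categories arrive as space-separated tokens with multiword names in quotes;
# the geocoding REST API wants one comma-separated string.
ui_value = '"African Food" "American Food" Intersection "Point Address" Bakery'
category_param = ','.join(shlex.split(ui_value))
print(category_param)  # African Food,American Food,Intersection,Point Address,Bakery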
"African Food" "American Food" "Argentinean Food" "Australian Food" "Austrian Food" Intersection "Point Address" "Street Address" Subaddress "BBQ and Southern Food" Bakery "Balkan Food" "Belgian Food" Bistro "Brazilian Food" Breakfast Brewpub "British Isles Food" Burgers "Cajun and Creole Food" "Californian Food" "Caribbean Food" "Chicken Restaurant" "Chilean Food" "Chinese Food" "Coffee Shop" "Continental Food" Creperie "East European Food" "Fast Food" "Filipino Food" Fondue "French Food" "Fusion Food" "German Food" "Greek Food" Grill "Hawaiian Food" "Ice Cream Shop" "Indian Food" "Indonesian Food" "International Food" "Irish Food" "Italian Food" "Japanese Food" "Korean Food" "Kosher Food" "Latin American Food" "Malaysian Food" "Mexican Food" "Middle Eastern Food" "Moroccan Food" "Other Restaurant" Pastries Pizza "Polish Food" "Portuguese Food" Restaurant "Russian Food" "Sandwich Shop" "Scandinavian Food" Seafood Snacks "South American Food" "Southeast Asian Food" "Southwestern Food" "Spanish Food" "Steak House" Sushi "Swiss Food" Tapas "Thai Food" "Turkish Food" "Vegetarian Food" "Vietnamese Food" Winery Butcher "Candy Store" Grocery Market "Wine and Liquor" "Bar or Pub" ...and here is how they go to the category parameter in the API (again, order does not matter): Intersection,Subaddress,Bakery,Bistro,Breakfast,Brewpub,Burgers,Creperie,Fondue,Grill,Pastries,Pizza,Restaurant,Seafood,Snacks,Sushi,Tapas,Winery,Butcher,Grocery,Market,African Food,American Food,Argentinean Food,Australian Food,Austrian Food,Point Address,Street Address,BBQ and Southern Food,Balkan Food,Belgian Food,Brazilian Food,British Isles Food,Cajun and Creole Food,Californian Food,Caribbean Food,Chicken Restaurant,Chilean Food,Chinese Food,Coffee Shop,Continental Food,East European Food,Fast Food,Filipino Food,French Food,Fusion Food,German Food,Greek Food,Hawaiian Food,Ice Cream Shop,Indian Food,Indonesian Food,International Food,Irish Food,Italian Food,Japanese Food,Korean Food,Kosher Food,Latin American Food,Malaysian Food,Mexican Food,Middle Eastern Food,Moroccan Food,Other Restaurant,Polish Food,Portuguese Food,Russian Food,Sandwich Shop,Scandinavian Food,South American Food,Southeast Asian Food,Southwestern Food,Spanish Food,Steak House,Swiss Food,Thai Food,Turkish Food,Vegetarian Food,Vietnamese Food,Candy Store,Wine and Liquor,Bar or Pub Here is the end result, a high proportion of point of interest geocodes with the vast majority of the remainder geocoded to rooftop locations (PointAddress and SubAddress). Exactly what I was looking for to use in my pest control business! Seattle Food Businesses The workspaces, CSV files and toolbox are in the post download. Now you can build smart input parameters for your ETL tools!
Posted 02-24-2022 05:19 AM

BLOG
Thanks, I'm so deep in the weeds I forget to demystify jargon.
Posted 02-11-2022 11:04 AM