POST
Hi, I don't know if installing the software in a non-standard path is a problem; the support people will be able to advise. Regardless, the fact that you are unable to start Workbench from the executable suggests something is wrong with licensing, as fmeworkbench.exe should start.
Posted 06-10-2021, 06:40 AM

POST
I haven't seen that install path before. Can you open a support call, please, so we can help you get set up correctly?
Posted 06-10-2021, 06:01 AM

POST
Hi Murat, if Data Interop is installed and licensed on the server you should be able to start Workbench from this path (the D: drive in your case). Log onto the server as the arcgis service owner account and run: "D:\Program Files\ESRI\Data Interoperability\Data Interoperability AO11\fmeworkbench.exe"
Posted 06-10-2021, 05:48 AM

IDEA
@EricEagle Data Interoperability is a closely guarded secret! Let me know if you need help.
Posted 06-08-2021, 05:55 AM

IDEA
Hello Open, I manage the Data Interoperability extension, which can output OBJ and STL out of the box. We don't have a PLY writer, but that looks reasonably simple to tackle with Python. Data Interop is an extension product. If you can share a terrain raster we can see if it meets your needs. Work with your Esri account representative or message me directly.
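For anyone wondering what "reasonably simple to tackle with Python" might look like, below is a minimal sketch of an ASCII PLY writer. It assumes the vertices and triangular faces have already been derived from the terrain raster (for example via a TIN); the function name and inputs are illustrative, not part of the shipping product.

```python
def write_ply(path, vertices, faces):
    """Write an ASCII PLY file.

    vertices: list of (x, y, z) tuples derived from the terrain.
    faces: list of (i, j, k) vertex index triples describing triangles.
    """
    with open(path, "w") as ply:
        # Header declares the vertex and face layout.
        ply.write("ply\nformat ascii 1.0\n")
        ply.write(f"element vertex {len(vertices)}\n")
        ply.write("property float x\nproperty float y\nproperty float z\n")
        ply.write(f"element face {len(faces)}\n")
        ply.write("property list uchar int vertex_indices\n")
        ply.write("end_header\n")
        # Body: one vertex per line, then one triangle per line.
        for x, y, z in vertices:
            ply.write(f"{x} {y} {z}\n")
        for i, j, k in faces:
            ply.write(f"3 {i} {j} {k}\n")
```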
Posted 06-07-2021, 07:57 AM

POST
Murat, have you checked that Data Interop is installed on the server? This is separate from being licensed.
Posted 06-02-2021, 06:21 AM

POST
Hi, the single most common cause of the 'invalid tool' error is not having Data Interoperability installed and licensed on the server, so please check this. If that checks out and you still have issues, please open a support call and reference this thread so we can work with the analyst.
Posted 05-03-2021, 02:07 PM

POST
That message is normal; it just means the data wasn't a map layer with a selection. It looks like you got 22 features written to PDF.
Posted 04-28-2021, 05:25 AM

POST
Zoran, try navigating to a folder you know exists and replacing the default path; the ellipsis picker will do that for you.
Posted 04-27-2021, 01:30 PM

BLOG
Joe & everyone, see I updated the tool to double-dip on parallelized geocoding by managing two queues.
Posted 04-27-2021, 05:56 AM

BLOG
Hi Joe, Data Interoperability 2.8 beta will be available early next week.
Posted 04-23-2021, 10:02 AM

BLOG
If your system of record is one of the many options available to you which are not supported geoprocessing workspaces in ArcGIS Pro, you might be tempted to adopt less than optimal workflows like manually bouncing data through CSV or file geodatabase just to get the job done. This blog is about avoiding that: simply reading your data from where it lives, geocoding it, and writing the output where you want it, all from the comfort of your Pro session, or even as an Enterprise web tool. Here are 1 million addresses I geocoded from Snowflake and wrote back to Snowflake without the data touching the ground. An aside: I got the Snowflake data into a memory feature class using this tool.

We're seeing people who need flexibility in one or both storage technologies, both where their address data is managed and where the geocoded spatial data lives. Data has gravity and there is no need to fight that. Let's see how to achieve this.

Full disclosure: the blog download contains the fmw source for a Data Interoperability ETL tool for ArcGIS Pro 2.8, which at the time of writing isn't released. I'm using that because it has the FME 2021 engine, which supports parallelized HTTP requests, and that is relevant for performance. If you don't have Pro 2.8 then ask our good friends at Safe Software for an evaluation of FME 2021 to surf the workspace. Here is how it looks when edited, and I always like this bit: you can't see any code because there isn't any! We're 21% through the 21st century, who writes code any more just to get work done? 😉

Partly I'm talking about moving data around, but you already knew you could do that; it's the batch geocoding that is the value here, so let's dig into that. When you're batch geocoding data that is in flight between systems of record, the Geocode Addresses geoprocessing tool isn't suitable: it requires table input and writes feature classes. I'm using the geocodeAddresses REST endpoint. To make the endpoint available I published a StreetMap Premium locator to my portal. I could have used ArcGIS Online's World Geocoding Service or a service published from a locator I built; the API is identical. Which way you go will depend on a few decision points:

- StreetMap locators on-premise don't send your data out to the internet
- StreetMap locators can be scaled how you want on your own hardware
- StreetMap locators have a fixed cost
- Online requires no setup but has a variable cost based on throughput
- Online sends your address data out to the internet (albeit securely)

I want to make the point that you can scale your batch geocoding how you want, so I went with StreetMap. Now, how to drive it.

Under the covers the geocoding engine considers an address in its entirety, scores how well it matches all known addresses, and picks a winner (or fails to). You will notice you can supply address content in parts (base address, zone fields) or as one value, SingleLine. If you supply address parts, the engine in fact concatenates them in some well-known orders, based on the address grammar for the area covered, before submitting them to the engine. Assuming you know the structure of your address data as well as Esri does, you may as well do this yourself and supply the whole address as SingleLine values, so you'll see this is what I do in my sample. The only other data dependency is an integer key field in your data you can name ObjectID; this comes out the other end as the ResultID field, which you can use to join back to your source data.

Let's go through the parameters of the ETL tool:

- CountryCode is a hard filter for country. If your locator supports multiple countries, like StreetMap North America does, and your data is from known countries, customize and use this parameter. If you only have a single-country locator, don't supply a value.
- MatchOutOfRange is a switch for finding addresses a small way beyond known house number ranges for street centerline matches.
- BatchSize should not be larger than the maximum allowed for your locator. You can see this number in the locator service properties; the blog download also has a handy Python script for reporting service properties, edit it for your URL.
- Categories lets you filter the acceptable match types; you could customize this parameter to support only certain types of Address matches, for example.
- LocationType lets you select rooftop or roadside coordinates for exact house point matches.
- GeocodeConcurrency isn't a locator property, it's a property of the batch handling of the ETL tool. My sample uses 4, meaning at any time the service is handling 4 batch requests. Make sure to configure as many geocode service instances as you need and make this property agree.

In my case I don't have serious metal to run on, just a small virtual machine somewhere in the sky, but if you have a need for a 64-instance setup then go for it. At some point, though, you'll be limited by how fast you can read data, send it out and catch the results. My guess is few people will need more than 8 instances in a large geography.

If you're a Data Interoperability or FME user you'll know there is already a Geocoder transformer in the product, and it can use Enterprise or Online locators; however it works on one feature at a time, and I want multiple concurrent batch processing for performance. In my particular case I'm using Snowflake, and it's possible to configure an external function to geocode against an Esri geocode service, but this is also one feature at a time (stay tuned for news from Esri on this function in Snowflake).

That's pretty much it. Surf the ETL tool to see how I made the batch JSON and how to interpret the result JSON, then implement your own workflows. Enjoy!

But wait, there's more! I edited the tool on 27th April 2021 to squeeze more performance (30%!) out of the already parallelized geocode step by managing the HTTP calls in two queues: while one set of 4 batches is processing, another set is uploading. This is the bookmark to look for.
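To make the REST pattern concrete, here is a minimal Python sketch of one batch call to a geocodeAddresses endpoint using SingleLine input and an OBJECTID key, plus a helper that reads the locator's batch size limit from the service properties. The URL and token handling are illustrative assumptions; this is not the ETL tool or the script in the blog download, just the shape of the requests they make.

```python
import json
import requests

# Illustrative locator endpoint; substitute your own portal service and token.
GEOCODE_URL = ("https://example.com/server/rest/services/"
               "StreetMapPremiumNA/GeocodeServer/geocodeAddresses")
TOKEN = "<token>"

def max_batch_size():
    """Read the locator's advertised MaxBatchSize from its service properties."""
    service_root = GEOCODE_URL.rsplit("/", 1)[0]
    info = requests.get(service_root, params={"f": "json", "token": TOKEN},
                        timeout=60).json()
    return info.get("locatorProperties", {}).get("MaxBatchSize")

def geocode_batch(rows):
    """Geocode one batch of (object_id, single_line_address) pairs.

    Each returned location carries a ResultID attribute that joins back to
    the OBJECTID supplied here.
    """
    records = [{"attributes": {"OBJECTID": oid, "SingleLine": addr}}
               for oid, addr in rows]
    params = {
        "addresses": json.dumps({"records": records}),
        "outSR": 4326,
        "f": "json",
        "token": TOKEN,
    }
    response = requests.post(GEOCODE_URL, data=params, timeout=300)
    response.raise_for_status()
    return response.json().get("locations", [])

if __name__ == "__main__":
    print("MaxBatchSize:", max_batch_size())
    for loc in geocode_batch([(1, "380 New York St, Redlands, CA 92373")]):
        print(loc["attributes"]["ResultID"], loc["score"], loc["location"])
```

The ETL tool does the same thing with several batches in flight at once, which is where the GeocodeConcurrency parameter comes in.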
Posted 04-23-2021, 08:55 AM

POST
This is very strange; please open a support call. The analyst will be interested in things like antivirus and file share configurations, such as whether you are using network-attached storage.
Posted 04-19-2021, 08:57 AM

BLOG
Continuous Integration / Continuous Delivery (CI/CD) is a software development methodology. I'm borrowing the term and applying it to data, not software, specifically data that endlessly changes over time and that you need to integrate into your systems of record, well, continuously!

Always one to jump in the deep end, my example takes a feed of public transport vehicle positions (accessed via a REST API) that refreshes every 30 seconds and pushes the data through to a hosted feature service in an ArcGIS Enterprise portal and also into a spatially enabled table in Snowflake. I will use a web tool for the processing, as high availability is obviously advisable. Here are bus, train and ferry positions a few moments ago (early Saturday local time) in Auckland, NZ.

(Figure: Bus, train & ferry positions in Auckland)

I'm probably working at the extreme end of continuous bulk data integration frequency; I expect the vast majority of integrations are performed at intervals of hours or days, but at least you'll know what can be achieved.

Bear with me while I have some fun with my integration scenario 😉. Let's say I work at Fako (a fictional name), who have solved the first-mile/last-mile problem of e-commerce. Fako has identified that commuter networks are very efficient for bringing people and retail goods together, buying or selling, when the riders are the buyers and sellers. In partnership with transport operators we remove a few rows of seats in each vehicle, both sides of the aisle, and replace them with grids of smart storage lockers to which internet shopping can be 'delivered' by our stevedores at our warehouses co-located with transport terminals. Buyers order from any web site for delivery on a day and route; sellers sell on our website and we transfer items to routes anywhere on the network. We're freight forwarders. Fako's mobile app lets customers use their phone to unlock the locker their item is in at any time during their trip. Some lockers are refrigerated; we have our own meal kit line. A very popular feature of our mobile app lets riders bid in auctions for abandoned items. Fako pays transport operators the equivalent of a rider fare per item, greatly boosting their effective ridership. Fako doesn't have to buy a fleet of delivery vehicles and the transport operators get increased revenue. Business is booming!

Fako's back-end systems run on Snowflake. To make everything work, Fako needs to maintain the network status continuously as spatially enabled Snowflake objects. Let's see how!

First the boring way, for which I happen to have disqualified myself by choosing a frequency Windows scheduled tasks don't support: copy a Spatial ETL workspace source fmw onto my Data Interoperability server and configure a scheduled task based on the command line documented in a log file from a manual run as the arcgis user:

Command-line to run this workspace: "C:\Program Files\ESRI\Data Interoperability\Data Interoperability AO11\fme.exe" C:\Users\arcgis\Desktop\ContinuousIntegration\VehiclePositions2Snowflake.fmw

You should carefully consider this option for your situation; it is robust and simple.

Now for the non-boring way. Like I said, I pushed myself in this direction by working with data that updates in bulk at high frequency. I make a web tool that performs the integration, then calls itself after waiting for the source data to update. Whoa, a web tool that calls itself, no webhook and no scheduling? It's crazy simple (possibly also just crazy).

I made two Spatial ETL tools, one real and one a dummy that does nothing but has the same name and parameters (none in this case). I shared a history item of the dummy version of VehiclePositions2Snowflake as a web tool and recorded the submitJob URL. It is important the web tool be asynchronous, so that when it gets called it doesn't block the workspace waiting for a response:

https://dev99999.esri.com/server/rest/services/VehiclePositions2Snowflake/GPServer/VehiclePositions2Snowflake/submitJob

Then edit the real ETL tool, in the final HTTP step, to call the submitJob URL. Run the tool and, from its history item, overwrite the dummy web tool. I'll let you surf the tool yourself, but basically the upper stream fetches the vehicle data and synchronizes it to the portal and Snowflake, and the lower stream waits for this and for 30 seconds to elapse, then makes the HTTP call.

(Figure: The Self Integrating ETL Tool)

Then just run the web tool once manually and you're off to the races; it will repeat endlessly. I'm sitting here refreshing my Pro map (no cache on the feature service layer) and seeing the transport fleet move around. In Snowflake my data is also refreshing.

(Figure: Snowflake console)

Back to some boring details. Don't forget that when publishing Spatial ETL tools as web tools, Data Interoperability must be installed and licensed on each tool hosting server, and when using web connection or database credentials like I am here, go to the Tools > FME Options dialog and export the required credentials (right click for the menu) to XML files, put these on your server and import them into the Workbench environment as the arcgis service owner. If manually running a workspace on the server you might have to change the Python environment too. Lastly, while the blog download has an FMW file, the tool you publish to your server should have an embedded source.

Now that was fun!
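If you want to see the self-call in isolation, here is a minimal sketch of the HTTP request the final step of the workspace makes against the asynchronous submitJob endpoint. The token handling is an illustrative assumption, and in the real tool this request is issued from the workspace's final HTTP step rather than from Python.

```python
import requests

# submitJob endpoint of the asynchronous web tool, as recorded when the dummy
# tool was shared. The tool itself takes no parameters.
SUBMIT_JOB_URL = ("https://dev99999.esri.com/server/rest/services/"
                  "VehiclePositions2Snowflake/GPServer/"
                  "VehiclePositions2Snowflake/submitJob")

def resubmit(token):
    """Kick off the next run and return immediately.

    Because the web tool is published asynchronously, submitJob hands back a
    job id straight away, so the caller never blocks waiting for the run to
    finish.
    """
    response = requests.post(SUBMIT_JOB_URL,
                             data={"f": "json", "token": token},
                             timeout=60)
    response.raise_for_status()
    job = response.json()
    return job.get("jobId"), job.get("jobStatus")
```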
Posted 04-16-2021, 01:27 PM

IDEA
Thanks for this idea, Phil. On re-reading it, I'm not sure I can read all attributes and geometry of deleted features given the GlobalID. This is possible with versioned enterprise geodatabases but, as far as I know, not with feature services; I'll check.
Posted 03-09-2021, 12:19 PM