POST
The Esri product is the Data Interoperability extension, which lets you make Spatial ETL tools in Pro; the FME product lives outside Pro. The support channels are separate at the customer level.
Posted 12-10-2020 09:43 AM | Kudos 0 | Replies 2 | Views 11255

POST
Adam, if you're using FME the best way to get support is via Safe's support channel (https://community.safe.com/s/support). Chat is very effective as a first step.
Posted 12-10-2020 09:23 AM | Kudos 0 | Replies 0 | Views 11270

BLOG
Data Interoperability integrates hundreds of data formats into ArcGIS. An old favorite workflow in ArcMap and ArcCatalog (personal favorite right there) was to create Interoperability Connections in the catalog tree and use them in maps and geoprocessing tools. The default behavior for an interoperability connection is to refresh once per session, which saves a cache of the data to disk. Here is how a connection looks:

This has been a popular workflow, but in Pro there are technical challenges to delivering the same experience. In Pro 2.6 we added the Data Inspector app to the Data Interop control in the Analysis ribbon, which lets you explore data and optionally export all or some of it to another format, like the home geodatabase of your project. That is good, but Data Inspector is a separate process, so the data isn't immediately useful in your project.

It turns out a simple way to bring interoperability data sources into a project has been sitting under our noses for a while: creating feature classes and tables in the memory workspace and automatically adding them to the active map or scene. The blog download has a toolbox with three model tools in it. Here is Interoperability Connection in edit mode; you can see it includes Interoperability Features and Interoperability Tables as submodels.

To bring interoperability data into your map or scene, simply run Interoperability Connection as a tool. The submodels will recursively search for feature classes and tables in the output file geodatabase created by the system tool Quick Import, copy them into the memory workspace, and add them to your table of contents (a scripted sketch of this step appears at the end of the post). Quick Import in the model creates a file geodatabase named InteroperabilityConnection.gdb as part of the process; I mark it as intermediate data (and hence it is automatically deleted). Make sure your geoprocessing settings allow overwriting existing datasets by checking the setting in Options.

Of course, memory workspace data will not persist between sessions, but you can easily copy it to a project database, and during a session the data will be very performant. You might like to share a project template containing this toolbox with your colleagues. Naturally, the Data Interoperability extension must be installed and licensed for now, but we have some news coming in a later release about another equivalency issue from ArcMap that might allow some exceptions. To give you confidence, here is a screen shot of a million XY events coming from a Snowflake instance:

But wait, there's more! A powerful function in Data Interoperability is using the web as a file system, not just FTP and HTTP but cloud storage platforms. For example, I uploaded a folder of thousands of MapInfo TAB files to OneDrive, and using the file browser I can choose: Then in my OneDrive connection I select a TAB file:

Have fun interoperating! P.S. Thanks to my colleague Emily K. for test help. Edited December 23rd 2020 to replace the download - the models in the toolbox had a fieldmap with persisting entries.
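For readers who like to see the mechanics in script form, here is a minimal sketch of the copy-to-memory step, assuming Quick Import has already written InteroperabilityConnection.gdb and a project is open in Pro. This is illustrative only and is not the code inside the model tools; the path is a placeholder.

```python
# Illustrative sketch only (not the model tools' internals).
# Assumes ArcGIS Pro with arcpy, an open project, and that Quick Import has
# already produced InteroperabilityConnection.gdb at the path below.
import os
import arcpy

gdb = r"C:\Temp\InteroperabilityConnection.gdb"  # placeholder path
arcpy.env.overwriteOutput = True                 # matches the Options setting noted above

aprx = arcpy.mp.ArcGISProject("CURRENT")
active_map = aprx.activeMap

# Recursively walk the geodatabase for feature classes and tables
for dirpath, dirnames, filenames in arcpy.da.Walk(gdb, datatype=["FeatureClass", "Table"]):
    for name in filenames:
        src = os.path.join(dirpath, name)
        dst = f"memory\\{name}"
        if arcpy.Describe(src).dataType == "FeatureClass":
            arcpy.management.CopyFeatures(src, dst)
        else:
            arcpy.management.CopyRows(src, dst)
        active_map.addDataFromPath(dst)  # add the memory dataset to the table of contents
```

In the blog download the same result is achieved with the model tools rather than a script.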
Posted 12-09-2020 09:14 AM | Kudos 1 | Replies 2 | Views 3175

POST
Adam, you found the recommended way to get the area feature attributes onto the points; maybe the default merge behavior changed between releases. Please keep using Pro anyway.
Posted 12-04-2020 06:18 AM | Kudos 1 | Replies 1 | Views 4151

POST
Adam, for the first part of your question: attribute names are case-sensitive, so you'll need to create an attribute connection from NAME in your data to Name on the writer. This is easy manually, but if you had dozens to do there is also an automated way. You have found the feature type fanout; it is 'Fanout by Attribute' in the old UI. Here is a good topic for the current experience: https://docs.safe.com/fme/html/FME_Desktop_Documentation/FME_Workbench/Workbench/Setting_Feature_Type_Fanout_Properties.htm?Highlight=fanout
Posted 12-03-2020 11:43 AM | Kudos 1 | Replies 3 | Views 4176

POST
Yes, that's right. I almost always use that setting in case my target feature class already exists; it functions like 'drop if exists, then create'.
Posted 12-03-2020 05:57 AM | Kudos 1 | Replies 0 | Views 3064

POST
Hello Adam, thanks for persevering with our dated training materials; we have transitioned to leveraging Safe Software's courses at present. For the first part of your question, we renamed the long form of the ArcObjects-based file geodatabase writer to drop the word 'ArcObjects'; we thought it didn't add anything. The short form of the format - GEODATABASE_FILE - is unchanged. For the second part of your question, it is the Table Handling parameter that has the options you are looking for. Thanks for visiting the Data Interoperability forum; there are some blogs to help you with your journey!
Posted 12-02-2020 12:31 PM | Kudos 1 | Replies 2 | Views 3079

POST
Hello Eric, can you please manually copy the fds file into your profile directory C:\Users\<username>\Documents\FME\Formats, refresh your fdl connection, and check that your fdl files contain data? Let me know directly via bharold at esri dot com, thank you.
Posted 12-01-2020 10:37 AM | Kudos 1 | Replies 1 | Views 1358

POST
For others interested in DATEX, Data Interoperability can write the required XML guided by the XSD documents available here: https://datex2.eu/schema/3/
Posted 11-30-2020 08:15 AM | Kudos 1 | Replies 0 | Views 1703

POST
Hello Veronique, a quick look at the DATEX II documentation shows it is XML-based with published XSD documents, so in theory Data Interoperability can write DATEX II, but personally I have not done so. I can see web traffic from partners who have. If you email me at bharold at esri dot com we can discuss your requirements in more detail and get you connected to the right people.
Posted 11-24-2020 10:03 AM | Kudos 0 | Replies 0 | Views 1751

BLOG
At the ArcGIS Pro 2.7 release in December, the FME engine in the Data Interoperability extension is FME 2020.2. The corresponding release of Data Interoperability for ArcGIS Enterprise with the same engine is the Enterprise 10.9 pre-release, available simultaneously with Pro 2.7. If your organization cannot implement pre-release software and you will be using Enterprise 10.8.1 (which uses FME 2020.0.1) until 10.9 ships, then Spatial ETL web tools or Workbenches run on a server may fail if either of these conditions is met:

The tool uses a format that is not supported by Enterprise 10.8.1/FME 2020.0.1.
The tool uses a transformer which has a new version number at FME 2020.2.

There is no workaround for the first situation. The affected formats are:

Microsoft Direct Draw
CityJSON
LDAP/Active Directory
Mapbox Vector Tile
Mapbox Vector Tileset
Mapbox MBTiles Vector Tiles
Planet Basemaps
Snowflake Spatial
12D Model

There is a workaround for the second situation, namely to substitute transformers from the FME 2020.0.1 engine when authoring your Pro 2.7 Spatial ETL workspace. These are available in the blog download; simply open the containing tool by editing it and copy/paste the transformers you want into your workspace. The affected transformers are:

FeatureReader
JSONExtractor
JSONFlattener
JSONFormatter
JSONFragmenter
JSONUpdater
JSONValidator
LocalCoordinateSystemSetter
SQLExecutor
StatisticsCalculator
TCLCaller

We apologize for this temporary situation, which will be resolved when ArcGIS Enterprise 10.9 ships.
Posted 11-13-2020 09:30 AM | Kudos 0 | Replies 0 | Views 1137

IDEA
We ship GDAL in our Python distribution, and I just checked that gdal.BuildVRT is in there, so you could build a script tool that writes a VRT.
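As an illustration (not a complete script tool), here is a minimal sketch of calling gdal.BuildVRT from the Python distribution shipped with ArcGIS Pro. The input rasters and output path are hypothetical.

```python
# Minimal sketch (hypothetical paths): build a VRT mosaic from a list of rasters
# using the GDAL Python bindings shipped with the ArcGIS Pro Python distribution.
from osgeo import gdal

gdal.UseExceptions()  # raise Python exceptions instead of silently returning None

src_rasters = [r"C:\Temp\tile_1.tif", r"C:\Temp\tile_2.tif"]  # assumed inputs
vrt_path = r"C:\Temp\mosaic.vrt"                              # assumed output

# BuildVRTOptions controls resampling, nodata, alpha handling, etc.
options = gdal.BuildVRTOptions(resampleAlg="nearest", addAlpha=False)
vrt = gdal.BuildVRT(vrt_path, src_rasters, options=options)
vrt = None  # release the dataset so the .vrt file is flushed to disk
```

In a script tool you would expose the raster list and the VRT path as parameters, for example via arcpy.GetParameterAsText.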
Posted 11-10-2020 09:09 AM | Kudos 0 | Replies 0 | Views 2563

BLOG
Here is where it rained today in Australia: a bit around Perth, a bunch in Queensland, and for the sharp-eyed a little at Cape York to keep the prawns and crocs happy. The legend shows precipitation in mm.

The Australian Bureau of Meteorology publishes downloadable weather data; my scenario is that I'm interested in republishing rain gauge observations to a hosted feature service, which is in the map above. After a little digging around I found the data is available via FTP, with a schema described in this user guide. The data is refreshed daily. While I'm not redistributing the data in this blog, I'll mention it is licensed under Creative Commons terms, so you can implement my sample if you wish. While the refresh rate is daily, each file can contain observations spanning more than 24 hours and from multiple sensors at a site.

Anyway, what I wanted was the daily observations pushed into a feature service in my portal; I could just as easily send the data to ArcGIS Online. These periodic synchronizations from the web are everywhere in GIS. Data custodians make it easy to manually obtain data; I'm going to show you how simple it is to automate synchronizations with Data Interoperability.

I usually describe Data Interoperability as Esri's 'no-code' app integration technology. Full disclosure: in this sample I did resort to some Python in the FME Workbenches I created, so I have to back off the no-code claim, but I can say it's low-code. You can see for yourself in the Workbenches.

I started out thinking I would make the scheduled process a web tool and schedule it with Notebook Server. That might have been the most fun to build, but I realized it just isn't called for in my use case. I fell back to a pattern I previously blogged about, namely using Windows Task Scheduler, but this time on a server. Why not use the desktop software approach? Well, to take advantage of a machine with likely very high uptime that I know can be scheduled outside my normal working hours.

Here is the Workbench that does the job of downloading the BoM product files and sending features to my portal's hosted feature service:

And I can't resist it, here is the Python, not too scary. It would be unnecessary if the filenames at the FTP site were stable (I could have used an FTPCaller transformer), but they have datestamps as part of their names, so it was just easier to handle that with some Python. While I was downloading the data I also cleaned it up a little (removing enclosing double quotes and newline characters), then sent all observations out into the stream. As the data sources don't change I made all parameters private, which simplifies the command to schedule. An illustrative sketch of this kind of download step appears at the end of the post.

All that I need to do is get the Workbench onto the server and make sure it runs. In the post download you'll find three FMW files:

MakeRainGauges.fmw
RefreshRainGauges.fmw
RefreshRainGauges - Server Copy.fmw

MakeRainGauges creates a geodatabase feature class I used in Pro to instantiate my hosted feature service. RefreshRainGauges is built from MakeRainGauges and only differs in that it writes to the feature service, with initial truncation; that's the Workbench I want to schedule. RefreshRainGauges - Server Copy only differs from RefreshRainGauges in its Python settings, to use Python 3.6+. I didn't use that name on the server; it is named that way just to get it into the post download.

On my portal server there was a little setup (I have a single machine with everything on it - don't forget to install and license Data Interoperability!). RefreshRainGauges uses a web connection to my portal. In this blog I describe how to create a portal web connection. This has to be copied to the server for the arcgis user that will run the scheduled process. The simplest way is method #2 in this article. Logged onto the server as arcgis, I first created a desktop shortcut to "C:\Program Files\ESRI\Data Interoperability\Data Interoperability AO11\fmeworkbench.exe", started Workbench, then imported the web connection XML file and tested reading the feature service. I also copied RefreshRainGauges to a folder and edited it to adjust the Python environment to suit the server (the sample was built with Pro 2.7 Beta but the server is running Enterprise 10.8.1).

Running the workspace interactively, the top of the log reveals the command to be scheduled:

Command-line to run this workspace: "C:\Program Files\ESRI\Data Interoperability\Data Interoperability AO11\fme.exe" C:\Users\arcgis\Desktop\RefreshRainGauges.fmw

The rest is simple: just create a basic task. The hardest part was figuring out what time to execute the command (I see data changing as late as 1AM UTC, so I went with 6PM local time on my server, which is in US West). Make sure the task will run if arcgis is not logged on, and note that the arcgis user will need batch job rights (or be an administrator, which I can do on my VM machine but you likely will not be allowed to do).

That's it - automated maintenance of data I can share with anyone! I finish with a screen shot of the processing log from last night's synchronization:
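The Python screenshot isn't reproduced here, so purely as an illustration (and not the workbench's actual code) here is the kind of download-and-cleanup snippet described above. The FTP host, folder, and datestamped filename pattern are hypothetical placeholders.

```python
# Illustrative sketch only -- not the workbench author's actual code.
# Downloads the latest datestamped file from an FTP folder (hypothetical host,
# path, and filename pattern) and strips enclosing quotes and stray newlines.
import ftplib
import re

FTP_HOST = "ftp.example.org"                       # hypothetical host
FTP_DIR = "/anon/gen/rain_gauges"                  # hypothetical product folder
PATTERN = re.compile(r"rain_gauges_\d{8}\.txt$")   # hypothetical datestamped name

with ftplib.FTP(FTP_HOST) as ftp:
    ftp.login()                                    # anonymous login
    ftp.cwd(FTP_DIR)
    names = [n for n in ftp.nlst() if PATTERN.search(n)]
    latest = sorted(names)[-1]                     # datestamps sort lexicographically
    lines = []
    ftp.retrlines(f"RETR {latest}", lines.append)  # retrlines already splits on newlines

# Clean up: drop enclosing double quotes and empty lines
clean = [line.replace('"', "").strip() for line in lines if line.strip()]
for record in clean:
    print(record)                                  # in the workbench these become features
```

Inside a workspace this sort of thing typically lives in a PythonCaller or a scripted parameter; the FTPCaller transformer mentioned above is the simpler choice when the remote filename is stable.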
Posted 11-09-2020 10:18 AM | Kudos 1 | Replies 0 | Views 2572

DOC
Since this thread pre-dates the availability of webhooks in hosted feature layers, I'll offer this option for those with the Data Interoperability extension, which has emailing capability (not discussed in the post): https://community.esri.com/community/arcgis-data-interoperability/blog/2020/09/29/power-your-integrations-with-arcgis-online-feature-service-webhooks-and-data-interoperability
Posted 11-09-2020 06:13 AM | Kudos 0 | Replies 0 | Views 14307

POST
Hi Jamal, the FeatureJoiner transformer in the Data Interoperability extension has this capability: FeatureJoiner
Posted 11-06-2020 06:41 AM | Kudos 2 | Replies 1 | Views 4152