POST
Hi, the simplest approach is to use an Aggregator transformer, merging existing attributes; you will get one feature with all the attributes available on the input features.
11-23-2022 12:56 PM

POST
Yes, of course, Data Interoperability for Pro works; you just have to make sure you install it at the same release as Pro.
11-18-2022 11:46 AM

IDEA
Does the Geo Data Uploader have an API or is it just a standalone tool?
11-08-2022 06:39 AM

POST
I think you should open a support call. It should run provided Data Interoperability is installed on the machine you're running the task on. I would check that arcpy imports (i.e. that licensing is OK), especially by instrumenting the script with arcpy.AddMessage() calls.
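As a rough sketch, instrumentation of that kind might look like the following. The function name and messages are hypothetical, and the script assumes an ArcGIS Pro install so that arcpy imports:

```python
# Hypothetical instrumentation for a scheduled geoprocessing script:
# surface licensing problems with arcpy.AddMessage()/AddError() so they
# show up in the task's log.
def run_etl_task():
    import arcpy  # imported inside the function so a failed import is easy to localize

    status = arcpy.CheckExtension("DataInteroperability")
    arcpy.AddMessage(f"Data Interoperability license status: {status}")
    if status != "Available":
        arcpy.AddError("Extension unavailable - check licensing")
        return
    arcpy.CheckOutExtension("DataInteroperability")
    try:
        arcpy.AddMessage("License checked out, running the ETL tool...")
        # ... run the Spatial ETL tool here ...
    finally:
        arcpy.CheckInExtension("DataInteroperability")

# The scheduled task would simply call run_etl_task().
```

Messages written this way appear in the geoprocessing history and the scheduled task's output, so a licensing failure is visible immediately.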
11-08-2022 06:25 AM

BLOG
Sorry I wasn't clear in the post. CheckOutExtension is an ArcPy function you would include in a Python script if you were using this approach with Windows Task Scheduler and a tool that uses a Data Interoperability ETL tool: https://pro.arcgis.com/en/pro-app/latest/help/analysis/geoprocessing/basics/schedule-geoprocessing-tools.htm If you're using the FME product, CheckOutExtension is not needed; FME has its own licensing technology.
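A minimal sketch of such a scheduled script, assuming a hypothetical toolbox path and model tool name (it requires arcpy, i.e. a local ArcGIS Pro install with Data Interoperability):

```python
# Minimal sketch of a script for Windows Task Scheduler; the toolbox
# path and model tool name below are hypothetical placeholders.
def run_scheduled_etl():
    import arcpy

    # Check out the Data Interoperability license before the ETL tool runs
    arcpy.CheckOutExtension("DataInteroperability")
    try:
        tbx = arcpy.ImportToolbox(r"C:\tools\MyProject.atbx")  # hypothetical path
        tbx.MyETLModel()  # hypothetical model tool wrapping the ETL tool
    finally:
        arcpy.CheckInExtension("DataInteroperability")

# The scheduled task runs the script with the Pro Python environment,
# which in turn calls run_scheduled_etl().
```

The try/finally ensures the license is checked back in even if the tool fails.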
11-04-2022 06:44 AM

POST
Hi, if your relationships are GlobalID-based (we only just added support), please try this workflow: https://community.safe.com/s/article/writing-arcgis-geodatabase-attachments
10-19-2022 06:41 AM

POST
It is also possible to "tell" the Clipper that it doesn't have to wait for any more clipping features: if you specify that the clippers arrive first, you can leave that parameter at its default of No.
10-14-2022 06:31 AM

POST
Hi Suzy, I see a lot of threads on the web with different issues; please log a support call so an analyst can follow up. I haven't seen this issue, but it has been a while since I had Pro 2.7 on a machine.
10-13-2022 06:45 AM

IDEA
The Data Interoperability extension has a GeometryValidator transformer that returns the locations of errors.
10-06-2022 06:20 AM

BLOG
I had another question: can raster data be sent to ArcGIS Enterprise the same way? Yes! For example in a GeoPackage; you just have to make sure the .gpkg item is cast from File to Workspace in your web tool before reading the data with the Select Data model tool.
10-05-2022 11:33 AM

BLOG
I took some questions offline which I'll answer here for the record. Can FME on Linux or macOS be used? Yes! The service-based approach isolates the client and server architectures, so you can use any FME architecture: Windows, Linux, or Mac. Can the Linux edition of ArcGIS Enterprise be used? Yes! The situation at the server end is the same as for the previous question.
10-05-2022 11:12 AM

BLOG
A recent blog of mine discussed how ArcGIS Data Interoperability relates to FME, and also clarified the licensing situation when FME Flow is being used to access ArcGIS executables, particularly for writing to an enterprise geodatabase. In that situation, other than for the file geodatabase exception noted, the ArcGIS software used must be ArcGIS Server or Enterprise. This isn't always convenient for an FME Flow site, so in this blog I'll give you an alternative: using a web tool, without a local ArcGIS install. Here is my example, a model tool:

Web Tool Source Model

The whole point here is to give FME Server users a way to write enterprise geodatabase objects using a web service rather than an executable. If your interest is in reading enterprise geodatabases with FME Flow, you can already do this with the ArcGIS Server Feature Service reader, provided you publish a feature service from an enterprise geodatabase object. Let's get started.

The important bits in the above model are: the input workspace parameter is the target output enterprise geodatabase, and the input data parameter is a file-based format that FME writes and ArcGIS reads. At the time of writing, the teams at Safe Software and Esri are putting effort into enhancing the FME EsriJSON reader/writer to fully support a schema object, but we're not done yet. Regardless, I'm going to use EsriJSON as my data protocol. I made test data with the Features To JSON geoprocessing tool, not FME yet. There are other options like GeoPackage you might choose, and more coming in the future, but I want to show you how to use EsriJSON for now, not least because you will be able to trust the geometry and schema to be reliable in a destination geodatabase.

Is there anything special about the enterprise geodatabase workspace parameter? When you add it to your model it is just a database connection file with an ".sde" extension, but put a little thought into it. I made a new connection file with the Manage Registered Data Stores dialog, and indeed registered my enterprise geodatabase as a data store. When publishing a web tool, datasets may be copied to the server or referenced from data stores. If your web tool needs to leverage any data that isn't static, then do what I did. Making the enterprise workspace a parameter protects it from being copied, but any other datasets need to have this behavior managed.

Why is the EsriJSON file zipped? Obviously this helps file upload speed, but it is also necessary because of how we use the web tool, i.e. via its REST endpoint, calling from FME Server. When a web tool supports a file input parameter, the file must be supplied as a URL or as a server or portal item, not a POST upload. Any JSON item uploaded unzipped will be recognized (incorrectly in our case) as GeoJSON, found to be invalid, and rejected. Zipping it protects it from this behavior. Even then, EsriJSON isn't a supported portal item type, and we have to tell the system it's a Code Sample item. This is a bit ironic, given I'm the no-code guy in the office, but I'll take the leave pass.

My web tool doesn't do any fancy geoprocessing; it only serves to demonstrate the EGDB writing approach, but we'll step through it for completeness. The first Calculate Value model tool downstream of the workspace variable generates a unique output feature class name for the target geodatabase. The only notable thing is I make sure to use raw string processing for the workspace variable (r"%DataStoreDBA%") because on the server it is a complex path with metacharacters like "\a" that would otherwise be interpreted. For example: C:\arcgisserver\directories\arcgissystem\arcgisinput\EsriJSON2EGDB.GPServer\extracted\p30\DataStoreDBA1.sde\FromEsriJSON0

First Calculate Value Tool

The second Calculate Value model tool downstream of the data file input unzips the data to the scratch folder.

Second Calculate Value Tool

On the server, an example path in the job folder looks like: C:\arcgisserver\directories\arcgisjobs\esrijson2egdb_gpserver\jf37903613c4f4e5685efc5e0861fdbbd\scratch\FeaturesToJSON_OutJsonFile_Nelson.json

Otherwise, the only processing is the write we want: the JSON To Features geoprocessing tool sends the data to where we want it (your processing may be much more complex). After a couple of executions I have data in my EGDB:

Data in the Enterprise GDB

After publishing a tool result as a web tool, I have a portal tool item with a REST submitJob endpoint:

SubmitJob Endpoint

Now we have something FME can use via HTTP to write data to the enterprise geodatabase. Before we dig into that, there are a couple of things to consider. In this workflow you are responsible for managing timeouts and contention in the data going up to the portal (where the REST endpoint gets its input data), in processing, and in the target database. I went into my Server Manager console and made sure to limit my web tool to a single instance with a 15-minute timeout. That way I can handle large files, always one at a time. Drum roll please: here is the FME workspace (main canvas) that calls the web tool:

FME Workspace

...and the looping custom transformer in the middle looks like this:

Looping Custom Transformer

Everything is in the blog download (authored in Pro 3.0), but let's step through the FME workspace. In your real world, FME Flow will be cooking up data to be written to the EGDB. I don't do that in my sample, but you will have to make file-based data to send. My input parameters are a user-defined zipped JSON file and the web tool home URL (not published, as it is static): https://dev0015493.esri.com/server/rest/services/EsriJSON2EGDB/GPServer/EsriJSON2EGDB

The first step is to get a portal token; that's the first (green) transformer. Then the input zip file is uploaded as a portal Code Sample item. Code Sample items contain a file that must have a unique name per folder, regardless of whether the Code Sample item name is itself unique. Just in case of a name collision, I delete any existing items with the same content name (found by rejection), then upload the input data. After that it's just HTTP to send each job, a looping custom transformer to wait for job completion; then I delete the portal item we are finished with, and finally email a result message to any interested parties. Now I'm off to the races: any time my FME job runs I get data written to my EGDB, without installing any ArcGIS software on my FME machine!

More Output Data

Don't forget EsriJSON is a work in progress at Esri and Safe Software, but you can get going now with other file-based formats.
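The unzip step and the raw-string trick above can be sketched in plain Python. The function and file names are illustrative; on the server the scratch folder would come from arcpy.env.scratchFolder:

```python
import os
import zipfile

def unzip_to_scratch(zip_path, scratch_folder):
    """Extract the uploaded zip into the job's scratch folder and
    return the path of the first extracted file."""
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(scratch_folder)
        return os.path.join(scratch_folder, zf.namelist()[0])

# Why raw string processing matters for the workspace variable: in a
# normal string literal "\a" is the single ASCII bell character, so a
# server path containing "...\arcgisserver..." would be corrupted.
bell = "\a"   # one character (the escape is interpreted)
raw = r"\a"   # two characters (the backslash is preserved)
```

Without the raw string, a path segment like "\a" silently turns into a control character and the workspace path no longer resolves.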
10-04-2022 01:01 PM

IDEA
Good idea Matthew! I tackled the problem recently too: https://community.esri.com/t5/arcgis-data-interoperability-blog/building-a-data-driven-organization-part-14-total/ba-p/1164899
09-28-2022 07:15 AM

BLOG
We do our best to automate things, but sometimes you need to wire up FME-technology parameters with Esri geoprocessing yourself. ModelBuilder is your friend. Here is my example: an ETL tool process needs help from a couple of Calculate Value model tools. Let's see why.

ModelBuilder making it all work

A customer asked how to build a web tool to power a Web AppBuilder geoprocessing widget, and it had to work a certain way. Excel files needed to be uploaded and zipped CSV data had to be returned; not just one zipped CSV file per Excel input file, but multiple CSV files in each zipfile. Okay... no map in sight, just my kind of job! You can pretty quickly stub out a Spatial ETL tool that applies any business logic you need to go from Excel to CSV (the two most popular non-spatial dataset types in the ETL world, by the way). But what about zipping them up and returning the zipfile as an output?

FME technology has some paradigm differences with ArcGIS. FME Workbench tends to work with what the Esri world thinks of as workspaces: folders and databases. When you configure an ETL tool you supply input and output datasets accordingly, but to the ArcGIS geoprocessing environment everything looks like an input parameter. How do you get data out for downstream geoprocessing, or as a result in an app like Pro? The simplest way is with ModelBuilder as part of the web tool. If your workflow is writing to a geodatabase, then put your ETL tool in a model and add a Select Data model tool to extract the features you want. You'll then have geoprocessing outputs available. For my example I had to get a little more creative.

The first challenge wasn't around parameter types; it was zipping multiple CSV outputs per input. Many formats in FME technology support conversion to a zipfile. The way to do this with CSV data is to supply a zipfile name as the last part of the output dataset "folder". Here is my ETL tool workspace:

Excel to CSV With Feature Type Fanout

And here is my output "folder":

CSV Writer Folder Path With Zipfile Path

If you look closely at the workspace you'll see I trigger a feature type fanout based on an attribute value: the CSV File Name is an expression, @Value(Environment). This gives me one output CSV file per unique attribute value. In the writer parameters I specify a path ending in "Output.zip". In operating system terms it is a file, but to Workbench (and to ArcGIS when calling the ETL tool) it is still a folder.

To make everything work as a web tool I need correct ArcGIS parameter types: folder input and file output. Folders do not work as outputs. I also want to specify a scratch file location for the output so that the job environment can see and return the data. The trick is to wrap the ETL tool in a model and set paths and parameter types explicitly. The two Calculate Value model tools are key. The first one creates a zipfile path in the job scratch environment but casts it to Folder:

Create Scratch Folder Zipfile Path

The second one takes the same path and casts it to File:

Folder Cast To File

Then everyone is happy. I can run the tool in Pro and it uses the local scratch environment; then I can share the result as a web tool and it will use the ArcGIS Server job scratch environment at run time. I get my data back in Pro and can use Web AppBuilder against the web tool.

Web Tool Output In Pro

Simple, isn't it! Now you can build your Spatial ETL web tools into your enterprise geoprocessing workflows.
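The feature type fanout plus zipfile trick can be mimicked in plain Python to show what the ETL tool produces: one CSV per unique attribute value, all inside a single Output.zip. The Environment key comes from the example above; the function itself is an illustrative sketch, not part of the tool:

```python
import csv
import io
import zipfile

def fanout_to_zip(rows, key, zip_path):
    """Write one CSV per unique value of `key` into a single zipfile,
    mimicking the feature type fanout the ETL tool performs."""
    groups = {}
    for row in rows:
        groups.setdefault(row[key], []).append(row)
    with zipfile.ZipFile(zip_path, "w") as zf:
        for value, group in groups.items():
            buf = io.StringIO()
            writer = csv.DictWriter(buf, fieldnames=list(group[0].keys()))
            writer.writeheader()
            writer.writerows(group)
            # Fanout: the attribute value becomes the CSV file name
            zf.writestr(f"{value}.csv", buf.getvalue())
    return zip_path
```

Each unique Environment value yields its own CSV inside the zip, which is what the File-cast output parameter hands back to the web tool client.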
09-26-2022 08:13 AM

POST
Are your Pro and Data Interoperability installations at the same release? I see an .atbx in your project, but 2.9 for Data Interoperability.
09-21-2022 02:00 PM