POST
Tom, in the Workbench application open the Tools menu and, in the Translation options, set the Python compatibility to match Pro: the option named "Esri ArcGIS Python 3.11" or similar, depending on version.
Posted 04-08-2025 06:03 AM

POST
Hello Kathy, a simple workaround is to refactor your workspace to use the Esri ArcGIS Portal Feature Service reader/writer with the dataset https://arcgis.com and the Generate Token authentication type, which will embed the credentials. Database credentials can be embedded in reader/writers in a similar way. Let us know how you get on.
Posted 04-02-2025 07:00 AM

POST
Hello Thomas, if you clone your ArcGIS Pro 3.4 Python environment, extend the clone with SciPy, set the Pro runtime environment to the clone, and in the Workbench translation options set your preferred Python interpreter to Esri ArcGIS Python 3.11, then SciPy will be available.
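As a quick check, here is a minimal sketch, assuming the cloned environment is active in Pro and the script runs under the interpreter selected in the Workbench translation options (for example inside a PythonCaller transformer):

```python
# Minimal check, assuming this runs under the interpreter chosen in the
# Workbench translation options (e.g. from a PythonCaller transformer).
import sys
import scipy

# sys.executable should point at the cloned ArcGIS Pro environment, and
# scipy should import from that environment's site-packages.
print(sys.executable)
print(scipy.__version__)
```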
Posted 03-18-2025 05:37 AM

POST
Hello Joe, the ArcGIS Pro 3.4.3 patch, planned for March 18th subject to testing, fixes this issue. Pro will offer the update when the patch becomes available. Regards
Posted 03-03-2025 08:47 AM

BLOG
Sometimes you want to get complex data into an ArcGIS Pro session, but on a trial basis, say into the built-in memory workspace, for inspection and investigation in maps and/or table views. Pro's Analysis ribbon Data Interoperability controls include the Data Inspector and Quick Translator apps, which support viewing and translation, but these run outside the Pro process and, while informative, require extra steps to make use of the data in Pro. ArcMap had a concept of an "interoperability connection", which could cache data locally, but wasn't as performant as you might like. However, it inspired this post, so hats off to ArcMap once again!

See below a simple model (in the Pro 3.4 toolbox in the blog download) that leverages the Quick Import system tool, delivered with ArcGIS Data Interoperability, which will read any of the hundreds of data formats supported by the extension, plus any you have configured as custom formats, and write the data into the memory workspace for instant access in your Pro session.

[Image: QuickImportToMemory model]

The processing is simple. The Quick Import tool has a data source input parameter that lets you pick a format and source location. What is saved in the tool is an example of GeoJSON at an API URL with filter parameters: building permits issued in Vancouver, BC to date in 2025.

[Image: Dataset input dialog]

Quick Import creates an intermediate file geodatabase, which submodels inspect for feature class and table outputs:

[Image: Submodel for features]
[Image: Submodel for tables]

The found data object paths are returned to the parent model and exported to memory, then collected to be model output parameters. The Collect Values model tool has the handy property that it suppresses visibility of output parameters so they don't clutter the model dialog when run as a tool.

[Image: Tool dialog has only an input parameter]

The intermediate file geodatabase is cleaned up when the data is in memory.

[Image: Features and table in memory]

See in the Contents pane that a feature class and a table (a few GeoJSON features without geometry) were output by Quick Import and exported to memory. So that's it, an easy button to get complex external data into Pro!

There are a few limitations, principally that selecting multiple input datasets will result in the processing of only the first. Datasets like the GeoJSON example saved in the tool, which result in multiple outputs, should work. To support complex logic in this workflow, you can use custom formats. See also here. Custom formats let Quick Import work with raw data using any logic built into the format by an external ETL workspace. Please comment in the blog with your observations!
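If you prefer scripting over ModelBuilder, here is a minimal sketch of the same idea, assuming the Data Interoperability extension is installed and licensed; the source dataset string and names below are placeholders, and the model in the blog download remains the working reference:

```python
# Sketch of the Quick Import to memory idea, assuming ArcGIS Data Interoperability
# is installed and licensed. The source string is a placeholder; build yours with
# the Quick Import tool dialog, or pass any supported format/location.
import os
import arcpy

arcpy.CheckOutExtension("DataInteroperability")

source = "GEOJSON,https://example.com/api/query?format=geojson"  # placeholder
staging_gdb = os.path.join(arcpy.env.scratchFolder, "quick_import.gdb")

# Quick Import writes everything it reads into a staging file geodatabase.
arcpy.interop.QuickImport(source, staging_gdb)

# Copy each staged feature class and table into the memory workspace.
arcpy.env.workspace = staging_gdb
for fc in arcpy.ListFeatureClasses() or []:
    arcpy.management.CopyFeatures(fc, f"memory\\{fc}")
for tbl in arcpy.ListTables() or []:
    arcpy.management.CopyRows(tbl, f"memory\\{tbl}")

# Clean up the intermediate geodatabase once the data is in memory.
arcpy.env.workspace = None
arcpy.management.Delete(staging_gdb)
```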
Posted 02-26-2025 12:59 PM

BLOG
It is always satisfying to share powerful new ways to solve problems, especially when the solution has been "hiding in plain sight" for a while. This time I'm showing how the combination of ArcGIS Enterprise branch versioning and cloud native data sharing delivers not only fast data access for people without portal access, but also the ability to ask the accessed data to travel back in time to when it was younger. Like these parcels: see a previously undivided parcel and now its three subdivisions.

[Image: Parcel subdivision]

Picture a dataset with millions of features under heavy daily maintenance, exactly what branch versioning is built to handle. Your customers can access all or any part of the default version for any moment in time. Forever. Without extra load on your Enterprise portal.

So, how did I get there? I simply noticed that the insert-only transaction model of branch versioning is a fit for incrementally creating GeoParquet files in cloud storage that jointly preserve data state over time and can be queried spatially and temporally to make local data on demand for your area and time of interest. It is, however, a very fancy query! The good news is you don't have to figure it out; the blog download has a notebook with examples for my parcel subject matter, just plug yours in. I didn't have to invent the query approach, Esri publishes workshop materials on the topic. For example, if you go to around minute 18 in this presentation you'll see what such a query looks like.

I did have to make GeoParquet files that I can query, and a maintenance workflow for initial and incremental parquet file creation. It all starts with the source branch versioned Enterprise geodatabase feature class. Normally you can't see the system fields that power branch versioning of a feature class, but if you add the archive class to the map they are available:

[Image: Archive class added to the map]

A couple of things to note in the fields map: ObjectID is demoted to an ordinary long integer (values are no longer unique) and various fields named GDB_* are added. They power viewing the data at a moment in time, which is how branch versioning works: the latest state for a feature wins, which may be a deleted state, but the data history isn't lost (unless you drop it), which makes time travel possible. The archive class is also good for discovering which edit moments are in your data.

With the archive class providing visibility to all fields, the sharing and maintenance workflow was possible. It goes like this:

- Create an initial parquet file with all archive class rows where GDB_BRANCH_ID = 0
- On any schedule that makes sense, create delta parquet files for new default branch row states
  - These have a GDB_FROM_DATE later than the maximum in all existing parquet files
  - They also have GDB_BRANCH_ID = 0
- Maintain all parquet files in your favorite S3-compliant object store at a glob path
- Give your data customers a notebook or script tool they can use to extract data
  - The supplied notebook requires DuckDB version 1.0.0 in the Python environment

Now, I'm advertising this as cloud native data distribution, but at the time of writing I'm still setting up my AWS account, so the attached notebook uses a local filesystem path; I'll update that when I have a public S3 URL path available. In the meantime you can download sample data for testing here, here, here and here. They are the initial bulk version copy and a few incremental delta files, with a few days of edits each.
Change the notebook pqPath variable to suit your environment until I get the S3 path in place.

Spoiler: the data I'm using isn't really being maintained in a branch versioned geodatabase, I made sample data; kindly see the data permissions in the item details for the links above.

You'll see in the notebook I supply a template for extent and time travel queries. I find I can extract all 2.7 million parcels in my data in a little over 3 minutes, from local disk. Access from S3 I would expect to be a little slower; we'll see when I have that set up. Try out the notebook for yourself.

You might have some questions about the notebook, so I'll try to anticipate a few:

- DuckDB 1.0.0 is used as it is in the Esri Conda channel and later versions handle geometry differently
- The bbox column in the parquet files is JSON type but queried as varchar, as DuckDB didn't seem to recognise the data as JSON
- I tried using the built-in rowid pseudocolumn in DuckDB but got errors, so I overrode it
- I tried writing the output feature class by bouncing through a spatially enabled dataframe but got errors
- In the blog download the project atbx has a script tool I used to find desired output text field widths

Now I'm going to be a little selfish. To make my sample data and parquet files I built a few ETL tools (Pro 3.4), which I could have scripted. These tools are not in the blog download. If you are interested in them please message me and I can share. It will help the team here if we hear how many people are interested in this data sharing paradigm, so please help us to help you.
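To give a flavour of the query pattern, here is a minimal sketch of an extent plus time travel query, assuming the parquet files carry the archive schema described above. The OBJECTID partition key, the GDB_IS_DELETE flag, and the bbox key names are assumptions on my part, so treat the notebook in the blog download as the working reference:

```python
# Minimal sketch of a time travel + extent query over the archive parquet files.
# Assumptions: files at pq_path share the archive schema, OBJECTID is the feature
# key, GDB_IS_DELETE marks deleted row states, and bbox is a JSON string with
# xmin/ymin/xmax/ymax keys. Check your own data before relying on these names.
import duckdb

pq_path = r"C:\data\parcels\*.parquet"   # or an s3:// glob once published
moment = "2025-01-31 00:00:00"           # the point in time to reconstruct
xmin, ymin, xmax, ymax = -123.2, 49.2, -123.0, 49.3  # area of interest

con = duckdb.connect()
sql = f"""
WITH states AS (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY OBJECTID
               ORDER BY GDB_FROM_DATE DESC
           ) AS rn
    FROM read_parquet('{pq_path}')
    WHERE GDB_FROM_DATE <= TIMESTAMP '{moment}'
)
SELECT *
FROM states
WHERE rn = 1                    -- latest state as of the chosen moment wins
  AND GDB_IS_DELETE = 0         -- drop features whose latest state is a delete
  AND CAST(json_extract_string(bbox, '$.xmin') AS DOUBLE) <= {xmax}
  AND CAST(json_extract_string(bbox, '$.xmax') AS DOUBLE) >= {xmin}
  AND CAST(json_extract_string(bbox, '$.ymin') AS DOUBLE) <= {ymax}
  AND CAST(json_extract_string(bbox, '$.ymax') AS DOUBLE) >= {ymin}
"""
df = con.execute(sql).fetchdf()
print(len(df), "rows as of", moment)
```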
Posted 02-20-2025 09:00 AM

POST
I think this could be modeled or scripted, but it might make more sense to use the LocateXT tools on the incoming Word documents before attaching them, to make accessory attachments that go along for the ride. Data Interoperability doesn't add anything in this situation as described, but might if other complexities arise.
Posted 02-14-2025 06:12 AM

BLOG
Ah I see, that sounds like a portal bug indeed. You could always shell out to ArcPy in the workspace, assuming the Geocode Addresses geoprocessing tool likes your portal locator; if not, use a file-based locator.
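For example, here is a minimal sketch of that idea, assuming ArcPy is importable in the interpreter the workspace uses; the paths and the address field map below are placeholders, not taken from your workspace:

```python
# Sketch of shelling out to ArcPy from a workspace. The table, locator path,
# field map and output path are placeholders; adjust to your own data.
import arcpy

in_table = r"C:\data\addresses.csv"                                  # placeholder
locator = r"C:\data\MyLocator.loc"                                   # file-based fallback
field_map = "Address Address VISIBLE NONE;City City VISIBLE NONE"    # placeholder
out_fc = r"C:\data\results.gdb\geocoded"

# Geocode the table; the result can be picked up by the rest of the workspace.
arcpy.geocoding.GeocodeAddresses(in_table, locator, field_map, out_fc)
```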
Posted 02-07-2025 10:08 AM

BLOG
Hi, unless I'm missing something, if you change your Authentication Type to Generate Token then you'll embed your username and password into the Geocoder and a new token will be generated on each run.
Posted 02-06-2025 11:35 AM

POST
Sorry, I was not clear: if you start the Workbench application from the Analysis ribbon or the ArcGIS program group, you'll see the Tools>FME Options>Default Paths dialog. I think I see a bug though: while database connections persist for the Workbench app, they do not for Quick Translate or the system Quick Import tool. I will investigate.
Posted 01-22-2025 07:28 AM

POST
Remy, I think this applies to Pro 3.3; if not, please reply. In the Tools>FME Options>Default Paths control you will see your connection storage options. You must have permissions for these locations. https://docs.safe.com/fme/2024.1/html/FME-Form-Documentation/FME-Form/Workbench/options_default_paths.htm
Posted 01-22-2025 06:20 AM

POST
Hi, if you're using the REST endpoint, then the input parameter needs to be a URL to a file (perhaps at a UNC path), or a server/portal item id. https://developers.arcgis.com/rest/services-reference/enterprise/gp-overview/#input
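As an illustration, here is a minimal sketch of submitting such a job from Python, assuming an asynchronous geoprocessing service; the service URL and the "Input_File" parameter name are hypothetical, so substitute your own service and parameter names:

```python
# Sketch of passing a file input to a geoprocessing REST endpoint, assuming an
# asynchronous service. The URL and the "Input_File" parameter name are hypothetical.
import json
import requests

submit_url = ("https://myserver.example.com/arcgis/rest/services/"
              "MyETL/GPServer/MyTool/submitJob")
params = {
    "f": "json",
    "token": "<token>",
    # A file parameter is passed as JSON with a url the server can reach
    # (an http(s) URL or a shared UNC path), or alternatively an item id.
    "Input_File": json.dumps({"url": r"\\fileserver\share\input.xlsx"}),
}
resp = requests.post(submit_url, data=params, timeout=60)
print(resp.json())  # contains the jobId and jobStatus to poll
```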
Posted 01-06-2025 06:16 AM

DOC
This post is a stepping stone between the ETL Patterns community space and Esri's Well-Architected Framework site, where theory becomes practical advice on how to implement the ArcGIS system. Specifically, this document relates to the Data Pipelines and ETLs topic in the integration pillar. If you browse the topic you'll see many ETL patterns described, but as the architecture site isn't designed as a software repository, the patterns link back to this community for samples you can explore and implement. To build the content, we constructed the document attached to this post and shown below: a tabular summary of ETL modalities across ArcGIS. You will likely have your own experience to draw upon and may disagree with the software suggestions; if so, please comment on this post! We hope to keep this "framework" under construction with the involvement of everyone.

[Table: Inbound ETL Patterns]
Posted 12-19-2024 11:59 AM

POST
Hi Doug, can you please open a support call? I know that sounds like a canned response, but in this case we'll have to rope in the geoprocessing team and they need something to track their work against. Thanks for reaching out.
Posted 12-12-2024 09:02 AM