POST
Hi Tom, to write to a scratch location on your server and make use of the data during translation, use the approach described here: https://community.esri.com/t5/arcgis-data-interoperability-blog/building-a-data-driven-organization-part-14-total/ba-p/1164899. We have a development issue to adopt the job folder scratch environment in ETL tools, but it isn't there yet, so please control this as described above. To fix your ArcPy issue, log into the server as the arcgis service owner, start Workbench from the fmeworkbench.exe executable, and set FME Options > Translation > Preferred Python Interpreter to Esri ArcGIS Python <version>. Do the same in the ETL tools you publish, and always set the tool's Scripting > Python Compatibility to Esri ArcGIS Python <version> as well.
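If it is unclear which interpreter Workbench is actually using, a quick sanity check helps. The snippet below is a minimal sketch of mine (not part of the original answer) that you could run from a PythonCaller or a startup Python script; it only assumes ArcPy is installed with the Esri interpreter.

# Illustrative interpreter check - confirms FME is using the Esri ArcGIS Python environment.
import sys
print(sys.executable)  # should point at the ArcGIS (Server or Pro) Python environment

import arcpy  # an ImportError here means the wrong interpreter is configured
info = arcpy.GetInstallInfo()
print(info['ProductName'], info['Version'])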
Posted 09-02-2022 07:07 AM

BLOG
My first flight since the pandemic hit was a trip to Vancouver, Canada last week to attend our partner Safe Software's 2022 FME User Conference, where the no-code integration and ETL community could connect and learn. I took a question at the event that prompted this blog: why does my change detection workspace think every feature has changed when I know only a few features really changed?

In case you're new to this, ArcGIS Data Interoperability inherits a very powerful function from FME, namely change detection, embodied in the ChangeDetector transformer or, less frequently, the Matcher transformer. Change detection is usually done between a source dataset and a derived dataset that is maintained as a mirror on a schedule. The ability to write only data changes (inserts, updates, deletes) without downtime is a great advantage, especially for targets like feature services, which can remain in use during maintenance. This blog is about avoiding surprises when performing change detection.

Back to my story: I had a ticketing problem that caused a delay at the gate for my northbound flight. The airline offered me a seat upgrade, so I chose a port side window seat to enjoy the view and do some ground truthing on data I was working on, namely fire locations in California. My flight was in daytime, but here is a dark theme map of fire locations during my trip.

Fires in California

There are a few places to get near real time fire data; the source isn't relevant to the topic, and what I relate isn't tied to any data format. I happen to be reading JSON and maintaining a hosted feature service.

Let's tackle the #1 cause of fake change detection: geometry storage and retrieval. First I will illustrate the issue. The blog attachment contains a CSV file with a single row. Add the CSV table to a map, then use the table's context menu Display XY Data tool to create a layer from the XY fields, using the WGS84 coordinate system. Throw a little Python at the resulting layer and you will see what happens to the XY coordinates used to create the layer:

with arcpy.da.SearchCursor('Fires_XYTableToPoint', "*") as cursor:
    for row in cursor:
        pass
row
(1, (-120.70132999999998, 39.331022000000075), 1169, -120.70133, 39.331022)

So, what went in as -120.70133, 39.331022 was stored as -120.70132999999998, 39.331022000000075. The data isn't being randomly shifted; it's just an artefact of how ArcGIS manages coordinates when it has to make some assumptions, and it has no practical implications except for our change detection case, because unless you say otherwise, coordinate changes will be detected with total strictness.

I'm maintaining a mirror of the source fire event JSON feed as a feature service, and I have a Spatial ETL workspace built to do it. I'm reading the JSON on a schedule and writing only changes to the target feature service; the process uses a ChangeDetector. Below is a screenshot of my ChangeDetector properties; take note of the geometry handling. I am checking for 2D differences, my data are points so lenient point order isn't relevant, I am not checking coordinate system names (note that identical coordinate systems on inputs using different names will cause "fake change"), but most importantly I use a vector geometry tolerance of 0.000001. If I used the default of zero, any difference right down to subatomic particle size would be detected as a change, and we have seen above how that kind of number change can be introduced.

Change Detector

How did I arrive at a vector tolerance of 0.000001? My "personal defaults" for what are real geometry changes are 3 decimal places for projected data and 6 decimal places for geographic data. You may work to tighter tolerances, but I have never seen such precision in production use. My data are in WGS84 geographic coordinates, so I use a 6 decimal place tolerance. If you prefer to make this aspect more visible in a workspace, you can use a CoordinateRounder in each input data stream with the same tolerances and use a vector tolerance of 0 in the ChangeDetector.

So that lets you avoid the #1 change detection pitfall, fake geometry "changes". Are there similar precision-based ones out there? Yes! You'll see in my ChangeDetector I have set a name for the Generate Detailed Changes list parameter. This causes the transformer to output a list that exposes the actual data values driving change. I have a field "date_created" in my data and I notice every feature seems to change, but only at the microsecond level - definitely fake change. Here is a log excerpt:

Attribute(string: UTF-8) : `delta{1}.action' has value `modified'
Attribute(string: UTF-8) : `delta{1}.attributeName' has value `date_created'
Attribute(string: UTF-8) : `delta{1}.originalValue' has value `20220816155935.732000'
Attribute(string: UTF-8) : `delta{1}.revisedValue' has value `20220816155935.732935'

I have no idea why this timestamp field is moving around, and I cannot know; you'll find yourself in this situation occasionally too, so just figure out ways to work around it. The simplest approach is to drop the fractional seconds from each date_created value. It may be tempting to round the value, but datetimes are tricky things; they are not always valid if rounded up (think 59.999999 seconds rounding up to 60 seconds, which should be 0 seconds in the next minute). So I chop off the fractional seconds with a StringReplacer defined as below in both input streams to my ChangeDetector:

Remove fractional seconds

Sorry to throw regex at you but sometimes it's necessary. So that was another precision-based issue we avoided!
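If you want to see the pattern outside Workbench, here is a small Python sketch of the same idea. The regular expression is my own illustration of the kind of pattern a StringReplacer could apply; the screenshot above holds the authoritative settings.

import re

date_created = '20220816155935.732935'
# Chop off a trailing ".<digits>" fractional-seconds suffix rather than rounding,
# so a value like ...59.999999 can never roll over into the next minute.
trimmed = re.sub(r'\.\d+$', '', date_created)
print(trimmed)  # 20220816155935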
There are other things to remember when performing change detection. The detailed change list is your friend; it will help you track down what is really changing. Be sparing with the fields you use to detect changes too; accidentally leaving in format attributes, for example, will cause bulk change to be found when in fact the changes aren't really in your data.

Once you have mastered change detection you'll be able to author very efficient workspaces; for example, here is mine finding real changes in the fire data and writing only the updates. You may spot that all features changed in this run (29 out of 29), which is a hint I may have more work to do...

Fire feature service maintenance with change detection

One more key point. The ChangeDetector transformer outputs a format attribute fme_db_operation with values of INSERT, UPDATE or DELETE; when writing to feature services or databases, this row-level attribute can be used by the writer to determine feature handling. Set this behavior with the Feature Operation property in the writer - you don't have to have multiple writers for each mode.

Using fme_db_operation

Don't put up with fake change!
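As a postscript, here is the geometry tolerance idea from earlier expressed as a tiny Python check. This is an illustration of mine, not part of the workspace; the ChangeDetector does the equivalent internally when you set the vector tolerance.

stored = (-120.70132999999998, 39.331022000000075)  # coordinates as retrieved from storage
source = (-120.70133, 39.331022)                    # coordinates as supplied in the CSV

tolerance = 0.000001  # 6 decimal places, my default for geographic data
changed = any(abs(a - b) >= tolerance for a, b in zip(stored, source))
print(changed)  # False - within tolerance, so no fake geometry change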
Posted 09-01-2022 08:19 AM

BLOG
I'm coining a new term here - Total Integration - meaning the continuous, automated integration of any data and any ArcGIS service. This concept combines data movement with refreshing the web services the data powers. No manual ETL, no disconnected processes after the ETL, no downtime, just continuous Enterprise Service Bus maintenance the way it should be done, in one process.

It has become trite to say you're "excited" to share details on functionality, but I really do feel using a Notebook as a lightweight scheduler will bring web tool automation to the masses. My example scenario happens to be about maintaining a portal locator (i.e. geocode service), a very common foundational service, but read between the lines and picture your full suite of services under continuous management no matter how the data powering them changes. Of course, you don't have to be maintaining a service; your web tool may be doing any sort of geoprocessing.

San Francisco Locator Data From Socrata

There are a few parts to total integration; the trick is making them work together. The logical steps are:

1. Publish a web tool that moves your data and refreshes your target web service (if applicable)
2. Automate the web tool on a schedule

In my case I use ArcGIS Data Interoperability in the web tool, but that's just an artifact of my data source. The lightbulb moments for me were realising I could combine ETL with core geoprocessing to include service refresh in my web tool, and also use a hosted ArcGIS Online Notebook as a lightweight scheduler for complete automation. While I'm usually a no-code guy, there is a little Python involved, but you can relax, I'm sharing the code patterns in the blog download. My blog scenario is at the high end in its usage of ArcPy, but only because ArcGIS currently lacks a geoprocessing tool that rebuilds a portal locator (note to self: get one coming); much of the time you might build your web tool using Data Interoperability called within ModelBuilder, so without ArcPy code. So with some ArcPy and using the ArcGIS Python API in a Notebook we can put it all together. Let's get started!

My source dataset to integrate is the City and County of San Francisco Enterprise Addressing System Addresses with Units. The data changes daily and I want a portal locator refreshed each day before working hours. Here is my Data Interoperability workspace, authored in Pro, which does the job interactively:

Refresh Portal Locator

The ETL tool reads a CSV file at a URL provided by Socrata, writes the features to a scratch geodatabase, performs change detection between the latest data and its previous state (maintained in a portal feature service), then writes the changes both to the feature service and an Edits feature class in the scratch geodatabase. It also emails me a handy summary of new or changed addresses. The tool is in the blog download and requires Pro 3.0+ and Data Interoperability.

Notes on the scratch environment: Since publishing this article I have noticed my web tool occasionally failing at run time due to the scratch geodatabase becoming corrupt; the fix was to refactor how the scratch environment is used, with a unique name generated at run time. It is not possible to set the file geodatabase writer to automatically overwrite the target geodatabase because it is locked at run time.

But how do changes get pushed to the portal locator by the ETL tool, you ask? By using a shutdown script that calls ArcPy in the geoprocessing scratch environment!
Data Interoperability ETL tools are geoprocessing tools, and the scratch geodatabase in this case is found by this scripted parameter:

import arcpy
scratchGDB = arcpy.CreateScratchName("xx", ".gdb", "Workspace", arcpy.env.scratchFolder)
arcpy.AddMessage('Using scratch GDB {}'.format(scratchGDB))
return scratchGDB

The scratch folder is found similarly:

import arcpy
scratchFolder = str(arcpy.env.scratchFolder)
arcpy.AddMessage('Using scratch folder {}'.format(scratchFolder))
return scratchFolder

When shared as a web tool the log shows these messages:

Using scratch GDB C:\Users\arcgis\AppData\Local\Temp\scratch\xx0.gdb
Using scratch folder C:\Users\arcgis\AppData\Local\Temp\scratch

Now the powerful bit: using ArcPy in a shutdown script to recreate the locator service. The locator creation code originated as a copy from a history item after a Create Locator tool run, the rest just from reading the help.

# If there are edits then recreate the portal locator
import arcpy
import datetime
import fme
import os
import pytz

scratchGDB = fme.macroValues['ScratchGDB']
scratchFolder = fme.macroValues['ScratchFolder']
portalURL = fme.macroValues['PortalURL']
serverURL = fme.macroValues['ServerURL']
portalUser = fme.macroValues['PortalUser']
portalPassword = fme.macroValues['PortalPassword']

arcpy.env.workspace = scratchGDB
arcpy.env.overwriteOutput = True

if arcpy.Exists('Edits') and arcpy.Exists('AddressesWithUnits'):
    # Create or recreate the locator if there are data changes
    arcpy.geocoding.CreateLocator("USA",
        r"{}\AddressesWithUnits PointAddress".format(scratchGDB),
        "'PointAddress.HOUSE_NUMBER AddressesWithUnits.Full_Address_Number';"+
        "'PointAddress.STREET_NAME AddressesWithUnits.Street_Name';"+
        "'PointAddress.STREET_SUFFIX_TYPE AddressesWithUnits.Street_Type';"+
        "'PointAddress.SUB_ADDRESS_UNIT AddressesWithUnits.Unit_Number';"+
        "'PointAddress.NEIGHBORHOOD AddressesWithUnits.Neighborhood';"+
        "'PointAddress.CITY AddressesWithUnits.City';"+
        "'PointAddress.SUBREGION AddressesWithUnits.County';"+
        "'PointAddress.REGION AddressesWithUnits.Region';"+
        "'PointAddress.POSTAL AddressesWithUnits.ZIP_Code';"+
        "'PointAddress.COUNTRY AddressesWithUnits.Country';",
        r"{}\SanFrancisco".format(scratchFolder),
        "ENG", None, None, None, "GLOBAL_HIGH")
    # Create the SD draft and SD
    locator_path = os.path.join(scratchFolder, 'SanFrancisco')
    sddraft_file = os.path.join(scratchFolder, 'SanFrancisco.sddraft')
    sd_file = os.path.join(scratchFolder, 'SanFrancisco.sd')
    service_name = 'SanFrancisco'
    pst = pytz.timezone('US/Pacific')
    sfNow = datetime.datetime.now(pst)
    sfNowStr = sfNow.strftime('%Y:%m:%d %H:%M:%S')
    summary = 'Point Address With Units locator for the City of San Francisco updated at {} PST'.format(sfNowStr)
    summary += '\nBuilt from the Socrata data source: https://data.sfgov.org/Geographic-Locations-and-Boundaries/Addresses-with-Units-Enterprise-Addressing-System/ramy-di5m'
    tags = 'San Francisco,EAS'
    analyze_messages = arcpy.CreateGeocodeSDDraft(locator_path,
                                                  sddraft_file,
                                                  service_name,
                                                  copy_data_to_server=True,
                                                  summary=summary,
                                                  tags=tags,
                                                  max_result_size=50,
                                                  max_batch_size=1000,
                                                  suggested_batch_size=150,
                                                  overwrite_existing_service=True)
    # Upload the locator
    if analyze_messages['errors'] == {}:
        arcpy.SignInToPortal(portalURL, portalUser, portalPassword)
        arcpy.server.StageService(sddraft_file, sd_file)
        arcpy.server.UploadServiceDefinition(sd_file, serverURL)

# Clean up the scratch GDB
try:
    arcpy.management.Delete(scratchGDB)
except:
    pass

So that's the ETL tool that I published as a web tool. Before moving on to how we orchestrate the web tool, I have a couple of tips on authoring and publication.

The ETL tool is parameterless as far as the geoprocessing environment is concerned, because I set all its inputs to not be published. This works around a behavior with an input CSV file that is at a URL: ArcGIS Enterprise cannot register a URL as a data store or copy the data to the server at run time, so we just hide it by not publishing the parameter.

The ETL tool uses a couple of web connections to access the portal and authenticate to an email service. You get these onto the server by exporting them as XML files from the Tools > Options > Web Connections menu, copying them to the server, logging onto the server as the arcgis service owner, starting Workbench from fmeworkbench.exe, and importing them from the same interface. Make sure to re-authenticate each connection after importing.

Because we're using the scratch environment on the server, which is shared between processes (this behavior may change), the web tool's geoprocessing service is set to run with a maximum of 1 instance; this prevents contention between instances writing to the scratch file geodatabase (multiple jobs will queue). If you need to publish multiple web tools that write into the server environment, refactor them to use temporary workspaces.

At this point we have a web tool that does what we want; now we need to orchestrate it on a schedule. For this we use a hosted Notebook in ArcGIS Online. This is the first notebook I ever authored (my no-code background shows), so if I can do it you can too. The web tool runs on my portal but my notebook runs in Online. That is no problem for the notebook: the web tool isn't executing in Online, under the covers the notebook is just authenticating to the portal and sending a submitJob REST request. Online notebooks know how to connect to portal GIS instances. Here is the single-cell notebook (Standard runtime):

Online hosted Notebook

How hard is that? Crazy simple. The hardest part was getting a certificate from the IT folks that the portal would accept when hearing from ArcGIS Online (my colleague Renato S. gets the credit for that). All I did was read the help and replace help examples with my paths and names. To find the function name to use after importing the web toolbox, I just put the code into IDLE so IntelliSense could give me the methods. Then, while editing the notebook, I went to the Tasks editor and set up my notebook to run weekdays at 6am. Note that the task editor time picker uses local time values from your browser client.

Now I can let the system(s) run, and each day my organization has an up-to-date locator maintained in a portal!

Locator item details plus today's address changes email
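Since the notebook screenshot doesn't reproduce well here, the sketch below shows what a single-cell scheduler notebook could look like using the ArcGIS API for Python. The portal URL, credentials and tool name are placeholders of mine, not the values from my organization; the blog download has the real pattern.

from arcgis.gis import GIS
from arcgis.geoprocessing import import_toolbox

# Connect from the ArcGIS Online hosted notebook to the portal that hosts the web tool
# (hypothetical URL and credentials - use your own secure credential handling).
gis = GIS('https://portal.example.com/portal', 'automation_user', 'password')

# Import the published web tool as a toolbox and run it; the generated function name
# depends on your tool, so 'refresh_portal_locator' here is hypothetical.
toolbox_url = 'https://portal.example.com/server/rest/services/RefreshPortalLocator/GPServer'
tbx = import_toolbox(toolbox_url, gis=gis)
result = tbx.refresh_portal_locator()
print(result)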
Posted 08-16-2022 06:47 AM

IDEA
Hi, I'm not suggesting this solves the problem generically but the Data Interoperability extension supports this function if you configure it, plus other forms of messaging.
Posted 08-08-2022 08:53 AM

BLOG
@EthanChen Yes you can do this. On your authoring machine, any user-defined or FME Hub downloaded custom transformers are stored per-user at these locations as fmx files:

User defined: C:\Users\<username>\Documents\FME\Transformers
Downloaded: C:\Users\<username>\AppData\Roaming\Safe Software\FME\FME Store\Transformers

To make linked transformers available to web tools on Enterprise, simply copy the desired fmx files from your authoring machine into the same location for the arcgis service owner user on the Enterprise machine. You could also embed the custom transformer(s) before service publication, in which case they will travel with the ETL tool.

While we're talking about it, you can also move any required credentials between machines by exporting them as XML files and importing them in Workbench opened by the arcgis service owner. You will need to navigate to the fmeworkbench.exe file on your server to open it; I create a desktop shortcut to make this easy. It is possible your service owner account has a different name; your admin will know.

As an alternative to the above ways of provisioning credentials to Data Interoperability for ArcGIS Enterprise, in the FME Options > Default Paths interface, edit the Connection Storage paths searched to be a share that both Pro and Enterprise can see.
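If you have a lot of fmx files to move, a small script can handle the copy. This is only an illustrative sketch with assumed user names; run it where both paths are visible (for example on the Enterprise machine with the authoring folder exposed as a share).

import shutil
from pathlib import Path

# Assumed locations - adjust the user names and paths for your machines.
source = Path(r'C:\Users\youruser\Documents\FME\Transformers')   # authoring machine (or a share)
target = Path(r'C:\Users\arcgis\Documents\FME\Transformers')     # arcgis service owner on Enterprise

target.mkdir(parents=True, exist_ok=True)
for fmx in source.glob('*.fmx'):
    shutil.copy2(fmx, target / fmx.name)
    print('Copied', fmx.name)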
Posted 07-27-2022 06:54 AM

BLOG
The Big Picture

The ArcGIS Data Interoperability team at Esri get this question a lot, especially at events like the Esri International User Conference. This article will equip you with some background and details we find most helpful in purchasing and usage contexts.

ArcGIS Data Interoperability powering ArcGIS Pro

In the above screen capture you can see Data Interoperability in a few places: the Analysis ribbon provides access to the Workbench, Data Inspector and Quick Translator apps (also available in the ArcGIS program group start menu), the project has custom Spatial ETL tools defined in a project toolbox, and ModelBuilder is calling a Spatial ETL tool. Not shown: the Quick Import and Quick Export system tools delivered with the extension in the Data Interoperability Tools system toolbox. These use default settings to translate between any non-raster format and file geodatabase (or other format already understood by ArcGIS) in either direction - very convenient for quick inspection or quick sharing of data.

Now, some details. Firstly, ArcGIS Data Interoperability is built using FME technology. Originally, Data Interoperability was conceived as a way for ArcGIS users to take advantage of Safe Software's industry-leading format support and FME's no-code approach to handling geospatial data at row level granularity, plus of course the transformational ability of FME technology when working with attributes and geometry. The technology was originally exposed as browsable connections to data and as geoprocessing tools in ArcMap and geoprocessing services in ArcGIS Server. None of that has gone away, but much new has been added, and how it is exposed has changed. If the ancient Greeks had invented computers and studied ETL as a discipline, Heraclitus might have said something like "You cannot step twice into the same data flow".

Let's cover format support. Data Interoperability is built from FME and therefore uses an FME 'engine' of a particular release, just like the FME product itself. At writing (December 12th, 2025) the engine in Data Interoperability is 2025.1.2 and in the FME product is 2025.2.1. We build Data Interoperability with the latest engine we can, but FME has a different release cycle and can get ahead. If you're interested in what the differences between engines are, you can go to this page and click the link for the FME Form change log. How all this relates to formats is that format support is bound to either an engine or an FME Hub package version, and FME packages themselves have minimum engine (and build) requirements. Data Interoperability gets every format we can obtain, so is comparable to FME Form; here are all the formats that are in FME Form but not in Data Interoperability for ArcGIS Pro 3.6.

The Format Delta - FME Formats Not in Data Interoperability @ Pro 3.6

APT
Aircom ENTERPRISE Map Data/ASSET Data
Autodesk AutoCAD Map 3D Object Data
Autodesk AutoCAD RealDWG DWG/DXF
Bentley Map XFM Design (V8)
Bentley Pointools POD
Bentley i-model Interchange Format
CARIS NTX
CARIS Spatial Archive (CSAR) Point Cloud
CARIS Spatial Archive (CSAR) Raster
DES
EDIGéO
ER Mapper ECW
FalconView File
GeoConcept Map
Graphic Technologies, Inc. (GTI) GTViewer
Intergraph FRAMME Standard Exchange Format (SEF)
JPEG 2000 (GeoJP2/GMLJP2)*
Marconi PlaNet
Metria AutoKa Transfer File (FF)
Microsoft MapPoint Web XML
Northrop Grumman C2PC Magic (Tech Preview)
Precisely MapInfo Extended TAB*
Precisely MapInfo TAB (EFAL - Tech Preview)*
Precisely MapInfo TAB (MAPINFO)*
Precisely Multi-Resolution Raster (MRR) (GDAL)
Smallworld 4/5
Tableau Hyper
Tele Atlas MultiNet Interchange format
VoxelGeo OpenInventor (VOIV)

* Note that JPEG2000 data is supported with the OpenJPEG2000 reader/writer and MapInfo TAB is supported with the MITAB reader/writer.

Let's cover transformer support. Below is a list of all transformers in FME Form 2025.2.1 that are not in Data Interoperability at Pro 3.6. They are 3rd party transformers available for purchase and use in FME - and hence out of scope for Data Interoperability.

The Transformer Delta - Transformers in FME 2025.0 not in Data Interoperability @ Pro 3.6

Curvefitter
GtransAttributeReprojector
GtransReprojector
ReframeReprojector

Other Considerations

Why would you buy FME or Data Interoperability over the other? If the above formats or transformers are critical to your work then you will need to buy FME. Otherwise there are other criteria you may balance.

ArcGIS Data Interoperability is sold and supported by Esri. FME is sold through Safe Software's channel.

Data Interoperability extends ArcGIS Pro and ArcGIS Enterprise: in the case of Pro, in the Analysis ribbon and as geoprocessing tools; in the case of Enterprise, as geoprocessing services or web tools (although you can use the Workbench application on an Enterprise machine to do things like import credentials, or even process data).

It is intended that FME Form and Data Interoperability can be used 'together' in terms of sharing Workbench workspace files, but if the products are at different FME engine releases, incompatibilities, while rare, can occur and are not bugs. For example, transformers may fail to 'arrive' in Data Interoperability when opening a workspace authored with FME at a later release because it has a newer version of a transformer. In this situation, re-insert the missing transformer(s) using Data Interoperability with parameter values taken from the FME original workspace. If this is a common occurrence at your office, it can be useful to keep a workspace with any old releases of transformers on your FME machine to copy/paste into workspaces destined for Data Interoperability.

FME Form uses ArcGIS Pro software for Esri binary format support such as geodatabase (enterprise, file, mobile) and Knowledge Graph. An exception is the API-based file geodatabase reader/writer that does not need Pro. Pro can be licensed in any of the usual modes. Enable offline use if you will be using standalone Python scripts that call an ETL tool.

FME Flow must use ArcGIS Enterprise software for any Esri binary format support such as file, mobile or enterprise geodatabase (with the above API-based format exception) or Knowledge Graph. ArcGIS Server Basic is sufficient for enterprise, file and mobile geodatabase support. ArcGIS Enterprise must be licensed for all cores on machines running FME engines that call ArcGIS executables. ArcGIS Pro is licensed as a single-user app and may not be used on an FME Flow machine to enable Esri formats.
However, FME Flow can write to an enterprise geodatabase without a local ArcGIS Enterprise installation if the target feature class or table is published by reference from a registered data store and FME Flow uses the Esri ArcGIS Portal Feature Service reader/writer with the resulting feature service. This approach uses REST calls and not any Esri executable. Please see Safe's article here, specifically the topic "Connecting to Esri Applications from FME Flow without a license."

Can you trade in Data Interoperability for a copy of FME or vice versa? There is no formal program for any sort of cross grade. Esri and Safe have tried to make the products interoperable and many sites have both products.

Does Data Interoperability have any FME Flow capability, and does FME have any ArcGIS Enterprise capability? Data Interoperability can be used to publish workspaces to FME Flow. There is a dependency that a standalone workspace (fmw file) be used and that the workspace cannot have been used as the source for an ArcGIS Pro ETL tool. Support for the workflow is from Safe Software. FME has no ability to publish web tools to ArcGIS Enterprise, but Data Interoperability can embed an FME workspace file (subject to fixing any incompatibility issues) into an ETL tool and publish it as a web tool.

Does Data Interoperability have the AI Assist features of FME Form? FME Form 2025.1+ includes AI Assist, which provides general workflow help, including using workspace context, and appears within various dialogs to help with generating Python code, regular expressions and SQL statements. This feature requires the user to be logged in to Safe Software's FME Account identity platform. See also this article. As ArcGIS users may not have FME accounts, the AI Assist features in FME Form are not currently exposed in ArcGIS Data Interoperability. Please note however that the various AI transformers available in FME Hub, such as AmazonBedrockConnector, OpenAIConnector, OllamaConnector etc. do work in Data Interoperability.

We hope the above gives you confidence that Esri and Safe have anticipated customer use of our products in a cooperative manner and with a minimum of friction. If you have further technical questions please post them in this forum; for business-related questions, go through your normal account representative. To obtain a free trial of the Data Interoperability extension go to this page.
Posted 07-26-2022 01:25 PM

IDEA
I really hope there is a better way to address the usability of null values than supporting empty strings in domains. When you have non-printing data in your fields that are not nulls you get bad behavior with attribute queries, statistical analyses etc.
Posted 07-25-2022 11:22 AM

BLOG
Hi everybody! If you made it to the 2022 Esri International User Conference and attended any of the Data Interoperability focused sessions, you may remember some of the material but perhaps not all, and I didn't share the workspaces and data at the time, so here they are!

https://pm.maps.arcgis.com/home/item.html?id=1ce84fa8ef06456fb77016e4ac77f69a

For example, the earthquake feed map:

Data Interoperability In Action

The package includes attachments; note the Word doc (DemoDescriptions.docx) with some brief notes on each demo, the workspace files (*.fmw) used, and some sample CSV, GeoJSON and zipped Shapefile data used in one demo. You'll need Pro 3.0 and the Data Interoperability extension to use the package. See you next year!
Posted 07-18-2022 01:45 PM

IDEA
Ryan, if you have the Data Interoperability extension you can refresh your target layer with a changes-only data set; let me know if that is of interest.
Posted 07-18-2022 08:24 AM

POST
If you open a support call with the central issue being what permissions are needed for SDE stored procedures in SQL Server, you will get the response you need. FME is a red herring as it's only passing the statements through, but mentioning it may deter Esri from owning the issue. The server team at Safe are a good resource too.
Posted 06-02-2022 07:09 AM

BLOG
I would have to check with the Enterprise team, but it does ring a bell that bi-directional collaboration is a recent feature. You can effectively 'sync' any combination of EGDB, Portal and AGOL datasets in both directions, but you would have to figure out the business logic as to which system owns the desired feature state. For example, field work might need to be ingested into the EGDB from Portal or AGOL feature services, while separately some features added to the EGDB might need to be pushed to the feature services. This logic might be driven by created_date and edited_date, or key field comparisons for new features.
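As a purely illustrative sketch of that kind of business logic, the per-feature decision might look something like the function below; the field names and the 'latest edit wins' rule are assumptions for the example, not a recommendation from this reply.

# Decide which system owns the current state of a feature, assuming both sides
# expose an edited_date and share a key field (illustration only).
def resolve_direction(egdb_row, service_row):
    if service_row is None:
        return 'push to feature service'   # new feature created in the EGDB
    if egdb_row is None:
        return 'ingest to EGDB'            # new feature captured in the field
    if service_row['edited_date'] > egdb_row['edited_date']:
        return 'ingest to EGDB'            # field edit is newer
    if egdb_row['edited_date'] > service_row['edited_date']:
        return 'push to feature service'   # office edit is newer
    return 'no change'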
Posted 05-24-2022 11:06 AM

BLOG
@AJ_devaccount Yes you can do this, scheduling from a Pro client is simplest but it can be executed on an Enterprise machine that has Data Interoperability installed if you need high availability. However, are you sure the core distributed collaboration does not meet your needs? Let's work through that before committing to the ETL approach.
Posted 05-24-2022 10:27 AM

POST
I missed that you are working in ArcMap; the Create Fishnet tool is in the Data Management > Sampling toolset in the tool catalog tree.
Posted 05-23-2022 01:42 PM

POST
It is here: https://pro.arcgis.com/en/pro-app/latest/tool-reference/data-management/create-fishnet.htm You can also create an oriented fishnet: https://pm.maps.arcgis.com/home/item.html?id=9398bd2232cb4c8490b0b05015364d28
Posted 05-23-2022 01:11 PM

IDEA
We do support table naming in mixed case unless the tables were created with delimited names, like "Type"; people sometimes do this to protect reserved words, for example. Is this your situation? Delimited names are not supported; however, the Data Interoperability extension can read tables with delimited names.
Posted 05-13-2022 06:51 AM