POST
I don't see why not. The Append Tool should be scriptable, but I haven't had time to look into that yet. It's a good idea!
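If anyone wants to try it, a bare-bones arcpy sketch might look like the following; the geodatabase paths and dataset names are placeholders, and I haven't verified this end to end:

import os
import arcpy

# Placeholder geodatabase paths -- adjust for your own data
loading_gdb = r"C:\temp\loading.gdb"
new_gdb = r"C:\temp\new_schema.gdb"

# Equivalent of checking "Preserve Global IDs" under Environments in the Append tool
arcpy.env.preserveGlobalIds = True

# Append the feature class and its related table into the new schema
for name in ("Assets", "Assets_Inspections"):
    arcpy.management.Append(
        os.path.join(loading_gdb, name),  # source
        os.path.join(new_gdb, name),      # target with the new schema
        "NO_TEST")                        # let the field map reconcile schema differences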
Posted 01-05-2018 01:14 PM

POST
Thomas, check out the update in my original post. I found a very simple, supported, non-hack way to do this using ArcGIS Pro.
Posted 01-05-2018 10:50 AM

POST
The data is to be hosted in AGOL eventually, so hacks won't work.
Posted 12-05-2017 10:48 AM

POST
After further testing, Mitch's solution won't work for us. The GUID field created doesn't auto-populate with a GUID in either ArcMap or ArcGIS Online once deployed. Unless there is another trick to it, I'll have to stick to my original solution.
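For context on why it doesn't auto-populate: a plain GUID field, unlike a GlobalID field, never gets values generated for it automatically, so you would have to fill it yourself. Here is a minimal sketch of doing that with arcpy; the feature class path and field name are placeholders, and this is not part of Mitch's workflow.

import uuid
import arcpy

# Placeholder path and field name -- adjust for your own data
fc = r"C:\temp\example.gdb\MyFeatureClass"
guid_field = "MyGUID"  # a field of type GUID, not GlobalID

# Write a new GUID into any row that doesn't have one yet
with arcpy.da.UpdateCursor(fc, [guid_field]) as cursor:
    for row in cursor:
        if not row[0]:
            row[0] = "{" + str(uuid.uuid4()).upper() + "}"
            cursor.updateRow(row)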
Posted 12-05-2017 09:03 AM

POST
Great answer, thanks! Your solution is better because once the change is made, I never have to deal with the issue again when changing schemas. Thanks again.
Posted 12-04-2017 04:57 PM

POST
Just discovered that if you use these exact names in your input table, geocoding fails as well: SHAPE_STArea and SHAPE_STLength. The reason this causes geocoding to fail is that the output is a POINT feature class; when ArcGIS attempts to generate the new feature class, it gets rejected because POINT feature classes aren't allowed to have those reserved field names. That makes sense, but it seems like the geocoding module should check for things like reserved field names when you provide the input table and refuse to accept it as an input (maybe with the red-circle exclamation point that ArcGIS GP tools are so fond of using when they anticipate an error with the input provided). Or, at the very least, the error window should specifically tell the user that the input table contained field names that aren't allowed in a point feature class, instead of it being some mysterious issue (as this thread can attest).
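To fail fast instead of hitting the mysterious error, one option is to screen the input table's field names before geocoding. A rough sketch is below; the table path and the reserved-name list are assumptions for illustration (names reported in this thread), not an official list from Esri.

import arcpy

# Hypothetical input table path -- replace with your own
address_table = r"C:\temp\addresses.gdb\AddressInput"

# Field names observed in this thread to break geocoding to a point feature class
suspect_names = {"SHAPE_STAREA", "SHAPE_STLENGTH", "SHAPE_STAREA__", "SHAPE_STLENGTH__"}

bad_fields = [f.name for f in arcpy.ListFields(address_table)
              if f.name.upper() in suspect_names]

if bad_fields:
    raise ValueError("Rename or drop these fields before geocoding: {}".format(bad_fields))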
Posted 10-30-2017 02:11 PM

POST
I found this page searching for the same error: "There was an error trying to process this table." I didn't see my resolution here, so I thought I'd share it.

If the address input table contains either of the following field names, geocoding the table will fail:
SHAPE_STArea__
SHAPE_STLength__

Those are two underscore ('_' + '_') characters at the end of the field name. It has to be two underscores; one is no problem. It's a bug I'm going to document with Esri Support for our case. I got those fields because I was trying to create a small stand-alone test table based on a subset of feature attributes by exporting them from a feature attribute table. It took forever to figure out why the bug was occurring with perfect data, and I'm sure it's not a very common occurrence, but if you have either one of those fields in your input CSV or stand-alone geodatabase table, geocoding your table will fail.
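If your exported table already has those fields, a quick fix (a sketch with an assumed table path, not an Esri-documented requirement) is to drop them before geocoding, since they are only artifacts of exporting from a feature attribute table:

import arcpy

# Assumed table path for illustration
table = r"C:\temp\test.gdb\AddressSubset"

# Drop the exported geometry area/length fields that break geocoding
for field in arcpy.ListFields(table):
    if field.name in ("SHAPE_STArea__", "SHAPE_STLength__"):
        arcpy.management.DeleteField(table, field.name)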
Posted 10-25-2017 03:23 PM

POST
We would like to have a Hosted Feature Service (HFS) that named AGOL users can bring into ArcMap for the "edit locally" and "sync back to the server" workflow. However, we also need that HFS to be viewable as read-only by the public. Is this possible? We've tried to set this up, but the only way to do the ArcMap local editing/sync workflow is to turn on the HFS's "Enable Editing" option. And when editing is turned on and the HFS is available to the public, anonymous public users can edit the data inside the HFS, which is not desirable. If you turn off Enable Editing, then we can't do the local editing/sync workflow. At this point, the only workflow that works is to store the GDB locally, permanently, and publish it out/overwrite the existing HFS with "Query" as the only capability checked whenever we have updates. It works, but it's a one-way workflow, which we were hoping to avoid.
Posted 10-10-2017 11:59 AM

POST
The spirit of this post is to gather other people's solutions, discuss ways of improving the suggested solution, and track future ArcGIS capabilities as they evolve for this problem.

This all started because we had existing data in a GDB that contained a Relationship Class (RC) between a Feature Class (FC) and a Table (TBL) using the FC's GlobalID field. We wanted to move that data into a new GDB that had a new schema (changes in domains, fields, etc.). The problem is that when you Append the old data into the new schema, the GlobalIDs in the new schema are different from those in the old records. This breaks the relationship between the FC and the related TBL.

UPDATE 2: I've got some bad news and I've got some good news. And some ugly details. But I also have steps/options.

The BAD news
The "New Solution" from my first update no longer works in Pro 3.x, as so kindly brought to our attention by @MichaelMannion a few days ago in this reply. Thanks Mike! Basically, ArcGIS Pro no longer allows you to change the data type in step 8 of the solution below. This breaks the solution I originally provided.

The GOOD news
ArcGIS Pro 3.x now respects the "Preserve GlobalID" environment option in the Append tool, but with very specific requirements. It works if the target database is an Enterprise Geodatabase or a Mobile Geodatabase, but not a File Geodatabase.

The UGLY details
In order for the Append tool to actually preserve the GlobalIDs from your source to your target, you still need to check the "Preserve GlobalID" environment option in the Append tool, and you can only get it to work with Enterprise Geodatabases (EGDB) and Mobile Geodatabases (MGDB) as your targets; it no longer works with File Geodatabases (FGDB). Here's why:
- The target GlobalID field must have an index that is configured as "unique".
- You can't make a GlobalID field with a unique index in a FGDB: "Unique and ascending indexes are not supported for shapefiles or file geodatabases. These parameters are ignored when the tool is executed on shapefile or file geodatabase data."
- A unique index can only happen if:
  - the table/feature class is in an EGDB/MGDB, and
  - you add the GlobalID to the table/feature class using ArcGIS Pro (3.x+)
  - alternatively, copy/pasting the empty target table/feature class into an EGDB or MGDB will automatically change the GlobalID index to unique in the paste target
- When you meet those two conditions, the index that gets automatically created for the GlobalID field has the unique setting.
- If your target field is a GUID data type, it must also have a unique index. You must add the index in ArcGIS Pro, on an EGDB table/feature class, and make sure the "unique" checkbox is selected before you run the Add Attribute Index tool.

The STEPS for Enterprise Geodatabases/Mobile Geodatabases
Below I only refer to an EGDB, but the same steps apply if you're using a MGDB. (A rough arcpy sketch of this workflow is included at the end of this post.)
1. Make sure your target table/feature class is in an EGDB.
   - If you have an empty table/feature class sitting in a FGDB but the indexes are not unique, just copy it over to the Enterprise Geodatabase using ArcGIS Pro (3.x+).
2. Make sure the target field in your table/feature class has a unique index.
   GlobalID
   - If you already have a GlobalID field but the index is not unique, you need to replace your table/fc with a new one:
     - Export it to a local FGDB, but make sure to remove GlobalID following these steps.
     - Delete the original target from the EGDB.
     - Replace the original target with the copy of your table/fc that has no GlobalID.
     - In ArcGIS Pro (3.x+), add GlobalIDs to the target table/fc (right-click > Manage > check the box for "Global IDs").
     - Confirm that the index for the GlobalID field is in fact unique.
   GUID
   - If you already have a GUID field but the index is not unique, you need to delete the index.
   - Then use the Add Attribute Index tool to create an index for the GUID, and make sure you check the box to make it unique.
   - Confirm that the index for the GUID field is in fact unique.
3. Now your EGDB target table/fc fields are ready! Open the Append tool.
4. Under Environments, check the "Preserve GlobalID" box.
5. Under Parameters, append as usual.
6. Once your data is in the updated schema, you can copy/paste it over to a File Geodatabase if you need to (for a deliverable or something).

The STEPS for File Geodatabases
All hope is not lost for users who must be restricted to File Geodatabases only. (But really, there is no reason to be afraid of the Mobile Geodatabase as an in-between step; it works well!) Thanks to @DirtDogRoj's excellently documented reply, you can follow clearly outlined steps to achieve the same result. I think you could even put it into ModelBuilder, as all the steps are very systematic. You might like this method better just because! 😉

Pay no attention 2:
UPDATE 1: I have a much, MUCH better solution that I discovered, and it doesn't appear to be documented anywhere. All the text below that has strikethrough you shouldn't pay attention to, as it was the old, convoluted solution.
New Solution:
1. Have your old data ready to load (your "loading" gdb; this is your source data).
2. Have your empty geodatabase with the new schema (your "new" gdb). It's okay that it has the GlobalID field; there is a new way to populate it with the GlobalID values from your loading geodatabase in ArcGIS Pro.
3. Open ArcGIS Pro.
4. Make sure your two geodatabases are added to the project (the "loading" gdb and the "new" gdb).
5. Run the Append tool.
6. Use your loading gdb feature class as the source and your new gdb feature class as the target.
7. In the tool's Environment settings, make sure the "Preserve Global ID field" checkbox is checked.
8. In the Append tool's Field Mapping section, under the "Properties" tab, change the target feature class data type for the GlobalID field from "GlobalID" to "GUID". (I know, this seems strange. The data type is set in the target feature class schema as "GlobalID", but if you don't change this to GUID, the GlobalID values from your source feature class will not migrate over to your new FC. If you change this setting to "GUID", they will magically migrate over to your new FC! I don't believe this is documented. If someone sees this documented anywhere, please share in the comments!)
9. Repeat steps 4-8 for your related table.

Pay no attention 1:
After doing some searching, I discovered that there is a way to do this using an Enterprise Geodatabase (EGDB) and ArcGIS Pro. I'm posting my basic workflow here on how to preserve the FC's GlobalID values so that when you migrate the data over to the new schema, the GlobalID values stay the same in the new FC. This preserves the relationship between the FC and the TBL in the new schema.
Assumptions:
- You have a FC you need to migrate to a new schema and would like to preserve the GlobalIDs.
- You have access to an EGDB and ArcGIS Pro.
Manual Steps:
1. Copy the source FC (Feature Class) to an EGDB (Enterprise Geodatabase).
   - Rename each class to have "_OLD" appended to it.
   - Note: You might need to deal with differences in domains at this point if they have the same name but different contents. You don't want your final domains to be appended with "_1"; if that happens, then after you copy them over you will need to turn off the domains where they are used and delete them in the EGDB. This doesn't affect the final product because it only happens in the source data.
2. Prepare the new schema of your target FC to have a GlobalID that can be preserved:
   - Take a copy of the empty FC schema.
   - Use X-Ray in ArcCatalog to remove the GlobalID field from the FC.
   - Create a new GDB using this new FC design.
   - Create a new "GlobalID" field manually (don't use the GP toolbox) in the FC.
     - Use the GUID data type.
     - Do this in the FC whose GlobalIDs need to be preserved; this is normally only necessary on the FC side of a one-way relationship.
   - Copy the FC to the EGDB.
   - In the newly copied FC in the EGDB, create a new index for this new "GlobalID" field.
     - Make sure it has the "Unique" box checked, and "Ascending" too.
3. Append records from the OLD FC to the NEW FC using ArcGIS Pro's Append tool.
   - Add the FCs to an ArcGIS Pro project.
   - Run the Append tool.
   - Make sure the "Preserve GlobalID" box is checked under Environments.
   - For "Schema Type", use the "Use the Field Map to reconcile schema differences" option.
At this point, you can copy/paste that new FC back to your location of choice and rebuild the RC so that it connects up with the new TBL.
It turns out that the GUIDs used in the related table to relate back to the FC are naturally preserved by the Append tool in ArcCatalog, so performing the workflow above on the related TBL is unnecessary. Even though the TBL's GlobalID (not GUID) values change when moving the data, that doesn't matter to us because they aren't used to create the relationship. We don't do this often, so we aren't going to take the effort to automate it, but I assume that might be possible.
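For anyone who would rather script the UPDATE 2 workflow than click through the tools, here is a rough arcpy sketch of the EGDB steps. This is not an official Esri sample: the paths, dataset names, and the "AssetGUID" field are placeholders, and it assumes ArcGIS Pro 3.x with an Enterprise (or Mobile) Geodatabase target whose GlobalID field already has a unique index.

import arcpy

# Placeholder paths -- swap in your own source data and EGDB target
source_fc = r"C:\temp\loading.gdb\Assets"
target_fc = r"C:\temp\egdb_connection.sde\NewSchema.DBO.Assets"

# Step 2 (GUID variant): give the GUID foreign-key field a unique index.
# Only needed if it doesn't already have one; delete any non-unique index first.
arcpy.management.AddIndex(target_fc, ["AssetGUID"], "idx_assetguid", "UNIQUE", "ASCENDING")

# Steps 4-5: check "Preserve GlobalID" under Environments, then append as usual
arcpy.env.preserveGlobalIds = True
arcpy.management.Append(source_fc, target_fc, "NO_TEST")  # field map reconciles schema differences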
Posted 09-29-2017 10:04 AM

POST
I just checked, and it does appear to be fixed. My hyperlinks in Open Data are now pointing to the correct license.
Posted 09-28-2017 09:21 AM

POST
Super helpful! Thanks. For those who would like a Python option, here's a gist of something I cobbled together: Give this script a geodatabase (Personal, File, or SDE connection) and it will remove the "Geoprocessing History" content…
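The gist itself isn't reproduced above, but the general idea is roughly the sketch below (a paraphrase, not the actual gist): walk the geodatabase with arcpy and strip the geoprocessing history from each dataset's metadata using the ArcMap-era XSLT tools. The stylesheet path and staging folder are placeholders.

import os
import arcpy

def remove_gp_history(workspace, xslt, scratch_folder):
    """Strip geoprocessing history from every dataset in a geodatabase (ArcMap-era tools)."""
    if not os.path.isdir(scratch_folder):
        os.makedirs(scratch_folder)
    for dirpath, dirnames, filenames in arcpy.da.Walk(workspace):
        for name in filenames:
            dataset = os.path.join(dirpath, name)
            out_xml = os.path.join(scratch_folder, name + ".xml")
            # Export the metadata through the "remove geoprocessing history" stylesheet,
            # then import the cleaned XML back onto the dataset
            arcpy.XSLTransform_conversion(dataset, xslt, out_xml, "")
            arcpy.MetadataImporter_conversion(out_xml, dataset)

# Placeholder paths -- adjust for your geodatabase, ArcGIS install, and temp folder
remove_gp_history(
    r"C:\temp\example.gdb",
    r"C:\Program Files (x86)\ArcGIS\Desktop10.5\Metadata\Stylesheets\gpTools\remove geoprocessing history.xslt",
    r"C:\temp\cleanup_staging")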
Posted 08-30-2017 03:57 PM

POST
I got the size from a query Esri Support suggested. It went from 60 MB to 25 MB, which means there was 35 MB of GP history in the metadata that got deleted. I don't know why it takes modern software so long to read 35 MB of data, but that's the way it goes with Esri software. It can do amazing visual gymnastics with 3D data in real time, yet it takes over a minute to read what is essentially 35 MB of text. Go figure.
Posted 08-15-2017 02:10 PM

POST
Updated: Discovered the issue: metadata Geoprocessing History. Cleared out the Geoprocessing History from the metadata for our large SDE geodatabase by running this Python code in ArcMap's Python console:

## Change this to point to your SDE connection file
gdb = r"C:\Users\...\AppData\Roaming\ESRI\Desktop10.5\ArcCatalog\SDE@YOURGDB.sde"
## Change this to match your file structure and location
remove_gp = r"C:\Program Files (x86)\ArcGIS\Desktop10.5\Metadata\Stylesheets\gpTools\remove geoprocessing history.xslt"
## Change this location to a temp folder to output the XML
name_xml = r"C:\temp\cleanup_staging"
arcpy.XSLTransform_conversion(gdb, remove_gp, name_xml, "")
arcpy.MetadataImporter_conversion(name_xml, gdb)
print 'completed'

This reduced the size of our GDB by more than 50%! After that, connecting to the GDB in ArcGIS Pro went from 1.25 minutes down to just 5 seconds. Fantastic. The bonus is that it sped up connection times to SDE in ArcMap as well. Thanks Esri Support!
Posted 08-10-2017 11:18 AM

POST
Thanks for the recommendation. There have been no upgrades on this machine; it's all fresh installs on a new Win 10 Pro OS. The part to remember is that ArcCatalog works just fine, while ArcGIS Pro is slow. This was also the case with my older Windows 8 OS, with ArcGIS Pro 1.4 and ArcGIS Pro 2.0. All things being equal, ArcGIS Pro is very slow with this large SDE connection and doesn't remember connections, while ArcCatalog works like a champ. The network is fast, the database server is fast, everything is fast. ArcGIS Pro is slow with the large SDE connections. This was true on my previous setups and is still true today with a new setup. Plus, that Esri expert admitted they had issues with Pro doing buggy stuff with SDE connections. I'll create a support ticket.
Posted 08-04-2017 09:18 AM