POST
For one of my users on 10.4.1, it is a public land survey layer of just over 2 million records served from a versioned Oracle geodatabase. He was using the selection tool to graphically select anywhere from 6 to 70+ records, opening the attribute table, and then highlighting a subset to delete. Then he would close the table... crash. If he left the table open, he could repeat that workflow over and over for the rest of the session with no problems. I don't have 10.5.1 up yet to test, unfortunately.
Posted 09-26-2017 01:54 PM

POST
Much faster! If you have Data Reviewer installed as part of the Mapping & Charting solution, search for that patch instead (not the one I linked above); it contains the same fix. A couple of sample results using two test versions:

Test version 1:
- 10.4.1 = 49 min
- 10.4.1 with patch = 2 min 10 sec

Test version 2:
- 10.3.1 = 21 min 30 sec
- 10.4.1 = 1 hr 4 min 33 sec
- 10.4.1 with patch = 3 min 52 sec
Posted 04-28-2017 11:29 AM

POST
Michelle - just saw this new patch: Esri Support 10.4.1. Are there any metrics you can share about how much of an improvement it makes?
Posted 03-13-2017 12:44 PM

POST
New Data Reviewer patch out: Esri Support 10.4.1 ...could be promising? From the patch notes:

Enhancements and Issues Addressed (Desktop and Server)
- TFS97746: Improve performance when running checks on Changed Features Only.
Posted 03-13-2017 12:35 PM

POST
I'm seeing similar behavior using Data Reviewer against versioned SDE... when only evaluating changed features (known to be slow to begin with). I'm using the same database and edit version and the same .RBJ file in both the 10.3.1 and 10.4.1 tests. I've broken the RBJ into 4 pieces (by feature class), and typically the largest feature class review fails with a .NET error or ArcMap crashes (1 edited record out of 1.7 million total features). The other feature classes are about 1,288 / 48,845 / 98,408 records, and they complete in 10.3.1 in about half the time it takes on 10.4.1. We'll probably start a ticket with support on this, but given the shift in ESRI's development emphasis to ArcGIS Pro I'm not optimistic about getting a fix - hopefully it's a known issue that's resolved in 10.5, I guess.
Posted 02-24-2017 02:27 PM

POST
We are scripted (part of a larger ArcObjects version management system), so another option might be to run the review on the extent of the changed features and then filter the results to report back to the user only the errors on new and modified records. Given our workflow and the spatial patterns of our data updates, I cannot hold the editor responsible for everything within the extent when evaluating the version for pass/fail - too much substandard legacy data that will never be updated is intermingled in the spatial extent they would have edited.
Posted 12-30-2016 06:35 AM

POST
I still have not found a consistent way to reproduce the problem, but in our case the bad extent usually looks like UTM values for our area instead of the decimal degrees of the defined coordinate system - perhaps some problem with users editing the GCS data while their data frame is in UTM. I'm considering rebuilding the data with a new spatial domain/resolution/tolerance instead of the default values, but I'm not sure when I could do that as these are heavily edited feature classes. For now I have an ArcPy script that runs nightly to loop through all data owner accounts and describe the feature classes. If the extents are outside of a certain geographic area, I put that feature class in a list that I email to myself and the other SDE Administrator for correction when we arrive in the morning.
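For what it's worth, the core test in a nightly sweep like that reduces to a simple bounds check. The sketch below is plain Python: in the real script the extents would come from arcpy.Describe(fc).extent, and the expected envelope shown here is a hypothetical placeholder you would swap for your own area.

```python
# Flag feature-class extents that fall outside an expected geographic window.
# In the real nightly script the extents come from arcpy.Describe(fc).extent;
# here they are plain (xmin, ymin, xmax, ymax) tuples in decimal degrees.

# Hypothetical envelope for "our area" -- adjust to your region.
EXPECTED = (-125.0, 24.0, -66.0, 50.0)

def extent_is_suspect(extent, expected=EXPECTED):
    """Return True if any corner of the extent leaves the expected window
    (e.g. UTM meters showing up where decimal degrees belong)."""
    xmin, ymin, xmax, ymax = extent
    exmin, eymin, exmax, eymax = expected
    return xmin < exmin or ymin < eymin or xmax > exmax or ymax > eymax

def suspect_feature_classes(extents_by_fc):
    """Given {feature_class_name: extent}, return the names to email out."""
    return sorted(fc for fc, ext in extents_by_fc.items()
                  if extent_is_suspect(ext))
```

A UTM-looking extent such as (435000, 4430000, 436000, 4431000) lands far outside the decimal-degree window and gets flagged, which matches the symptom described above.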
Posted 10-24-2016 07:52 AM

POST
We've been making our editors run scripted Data Reviewer against their SDE versions prior to posting for over a year now, and have frustrated enough users that we are investigating a process to export the changed records in the version out to a file geodatabase just to run Data Reviewer.

Our test version summary (about 15 checks in the RBJ for each feature class, most of them SQL checks):
- 100 edited records in TEST1_POLY (of 68125 total records)
- 100 edited records in TEST2_POLY (of 83631)
- 100 edited records in TEST3_POLY (of 54762)

Results:
- ~30 minutes to run Data Reviewer against the user's version (changed records only)
- ~2 minutes to export the changed data to file GDB and run Data Reviewer against it... and more than half of the 2 minutes was the export itself.

Needless to say, that's enough evidence that we're going to throw more resources at this prototype. We do recognize that some checks will not work against the exported file GDB data (such as validating unique values across the entire feature class), so we might need two separate RBJ files - one run against the database, another against the exported file GDB data. There will also need to be a step to link the file GDB results back to the SDE OBJECTIDs for the final report, but I can't imagine that will be significant. There are other issues to sort out as well.

If anyone has attempted something similar or has a different idea on how to work around the abysmally slow Data Reviewer SDE processing, we'd love to hear about it... -Steve
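The OBJECTID linking step could be as simple as carrying the source OBJECTID along as an attribute during the export and re-keying the results afterward. A rough plain-Python sketch (the SRC_OID field name and the dict-shaped result rows are hypothetical stand-ins for the real cursor data):

```python
# Re-key Data Reviewer results from the exported file GDB back to SDE
# OBJECTIDs. We assume the export copied each SDE OBJECTID into a SRC_OID
# attribute, giving us a {file GDB OBJECTID: SDE OBJECTID} mapping.

def link_results_to_sde(results, src_oid_by_gdb_oid):
    """results: list of dicts with 'OBJECTID' (file GDB) plus check info.
    Returns copies of the rows with an added 'SDE_OBJECTID' key."""
    linked = []
    for row in results:
        out = dict(row)
        out["SDE_OBJECTID"] = src_oid_by_gdb_oid[row["OBJECTID"]]
        linked.append(out)
    return linked
```

The final report would then cite SDE_OBJECTID instead of the throwaway file GDB OBJECTID, so editors can find the flagged records in their version.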
Posted 10-21-2016 08:47 AM

POST
I have a feature class in a versioned geodatabase that contains a status field flagging each record as "Proposed" or "Completed". The data users want this published as two separate feature classes based on the status field. I am looking at using a 1-way replica with a definition query filter (based on the status field) to separate the data into the two publication feature classes (in file geodatabases, but possibly also non-versioned SDE feature classes).

When a user edits a record to change it from Proposed to Completed, the subsequent synchronization adds that record to the publication Completed feature class, but it does not remove the record from the publication Proposed feature class - it remains there (with its new status of Completed)... so now the same record is in both publication feature classes.

My initial thought is to sweep through the publication feature classes after synchronizing and delete any record whose status was updated and no longer meets the criteria of the definition query (in the example above, delete the now-Completed record that still exists in the publication Proposed feature class). Does this sound like a valid way to accomplish this request by the data users, or am I possibly going to break the replica? Early tests seem OK, but I've only done a few syncs/deletes.
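The selection logic for that post-sync sweep is small; a minimal sketch, assuming a STATUS field name and using plain dicts in place of cursor rows (in practice the deletes would happen through an arcpy.da.UpdateCursor on the publication feature class):

```python
# After a one-way replica sync, find rows in a publication feature class
# that no longer satisfy that class's definition-query criterion.
# Rows are dicts standing in for arcpy cursor rows; the STATUS field
# name is an assumption for illustration.

def rows_to_delete(rows, wanted_status):
    """Return OBJECTIDs of rows whose STATUS no longer matches the
    publication class's criterion (e.g. 'Proposed')."""
    return [r["OBJECTID"] for r in rows if r["STATUS"] != wanted_status]
```

Run against the Proposed publication copy with wanted_status="Proposed", this picks out exactly the records that flipped to Completed and should be swept out.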
Posted 07-17-2015 09:00 AM

POST
Talked to ESRI Support, and my problem was that I was checking versioned SDE data with the CHANGED_FEATURES option... in that case I need to check "Always Run on Full Database" under "Feature or Object Class 2" (my fertilizer table). Otherwise, if all the user did was change the polygon shape, the check would not find a fertilizer table record, because the fertilizer record was not also updated. Here is my current check, and it has tested out fine:
Posted 04-03-2015 11:46 AM

POST
Follow-up: I did have a user report that they attempted to insert data into an SDE feature class from a shapefile that did not have a defined projection... and then the extents changed radically. That makes sense... well, the reason the extents changed makes sense - not why we still have people trying to use shapefiles with no projection definition - oh, the humanity! It's an easy enough fix for the feature classes using the property sheet in ArcCatalog, or sdelayer -o alter -E for the topology layers (if affected)... but does anyone on 10.3 know how to fix the topology layer extents without sdelayer?
Posted 03-18-2015 02:03 PM

POST
Thanks Jay - we ended up using ArcMap to make a saved "Session 1" within a master file geodatabase with the default checks disabled (remember to hit Apply!). When our routine runs, it copies that file geodatabase to the user's workspace on disk and then runs CreateReviewerSession_Reviewer against the copied GDB with the "session_template" value set to "Session 1". The newly created session is the input to ExecuteReviewerBatchJob_Reviewer and runs without the default check, instead using our Invalid Geometry check with our defined Severity setting and output Check Title, which are keys into our application. -S.
Posted 02-26-2015 10:49 AM

POST
Perhaps I am exceeding what this check can do. Here is my scenario: I have a polygon feature class showing where chemicals have been sprayed. One of the attributes is the type of chemical - for this example, I am looking for fertilizers. There is a separate table for reporting the type of fertilizer, and an integer key to establish a relate between the two tables - but there is NOT a geodatabase relationship class.

What I want to find with Data Reviewer are polygons where the chemical type is fertilizer but there is no related record in the fertilizer table, or the record in the fertilizer table has Null in the fertilizer type field. Is this possible? I seem to be able to use Table to Table to find records in the polygon feature class that have a matching record in the fertilizer table where the type isn't Null, but I am unable to get Data Reviewer to find chemical polygons representing fertilizer spraying that do not have corresponding records in the fertilizer table. And yes, I know about the "Not - find rows that do not match" option!

Does anyone have a similar example that they have implemented? ArcGIS 10.2.0, ArcSDE 10.1 on Oracle 11.2. Thanks!
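Outside of Data Reviewer, the condition described here is a classic anti-join. A small self-contained sqlite3 sketch of the SQL (table and field names are hypothetical stand-ins for the real polygon feature class and fertilizer table; against Oracle the same LEFT JOIN shape applies):

```python
# Anti-join sketch: fertilizer polygons that have no related fertilizer-table
# record, OR whose related record has a NULL fertilizer type.
# Hypothetical tables: SPRAY_POLY(oid, chem_type, fert_key) and
# FERT_TABLE(fert_key, fert_type).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE SPRAY_POLY (oid INTEGER, chem_type TEXT, fert_key INTEGER);
CREATE TABLE FERT_TABLE (fert_key INTEGER, fert_type TEXT);
INSERT INTO SPRAY_POLY VALUES (1, 'Fertilizer', 10);   -- has a good record
INSERT INTO SPRAY_POLY VALUES (2, 'Fertilizer', 11);   -- related type is NULL
INSERT INTO SPRAY_POLY VALUES (3, 'Fertilizer', 12);   -- no related record
INSERT INTO SPRAY_POLY VALUES (4, 'Herbicide', NULL);  -- not a fertilizer
INSERT INTO FERT_TABLE VALUES (10, 'Nitrogen');
INSERT INTO FERT_TABLE VALUES (11, NULL);
""")

cur.execute("""
SELECT p.oid
FROM SPRAY_POLY p
LEFT JOIN FERT_TABLE f ON p.fert_key = f.fert_key
WHERE p.chem_type = 'Fertilizer'
  AND (f.fert_key IS NULL OR f.fert_type IS NULL)
ORDER BY p.oid
""")
bad_oids = [row[0] for row in cur.fetchall()]
print(bad_oids)  # -> [2, 3]
```

The LEFT JOIN keeps every fertilizer polygon; a NULL join key means no related record exists, and the OR clause also catches related records whose type field is Null.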
Posted 02-24-2015 11:56 AM

POST
We are running Data Reviewer batch files within an ArcMap .NET program by calling geoprocessing. It appears that the presence of certain checks (see the "Note" at the top of ArcGIS Help (10.2, 10.2.1, and 10.2.2)) causes the default checks to run. We have also written the Invalid Geometry check into our RBJs (as well as multi-part checks) with SEVERITY = 1, because we use these severity values to classify the results later. Our problem is that Data Reviewer seems to be ignoring our RBJ's Invalid Geometry check (SEVERITY = 1) and just running the same check as a default with SEVERITY = 5... which is a value our application does not classify. Is there a way our programmer can either get the default check to return a severity of 1, or get our RBJ's Invalid Geometry check to run instead of being bypassed? Otherwise I think we're going to need to read REVTABLEMAIN separately for the default check title of "Invalid Geometry Check (FC Name)" and SEVERITY = 5, and reformat it to match the results and classification of the other checks we wrote.
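The REVTABLEMAIN fallback mentioned at the end is simple to sketch. Plain Python, with the CHECKTITLE/SEVERITY field names assumed and dicts standing in for table rows read through a cursor:

```python
# Fallback: re-map the default Invalid Geometry Check results (SEVERITY = 5)
# to the severity our classifier expects (SEVERITY = 1), matching on the
# default check title prefix. Rows are dicts standing in for a cursor over
# REVTABLEMAIN; field names are an assumption for illustration.

DEFAULT_TITLE_PREFIX = "Invalid Geometry Check"

def reclassify_default_geometry_checks(rows, new_severity=1):
    """Rewrite default-check severities in place and return the rows."""
    for row in rows:
        if (row["CHECKTITLE"].startswith(DEFAULT_TITLE_PREFIX)
                and row["SEVERITY"] == 5):
            row["SEVERITY"] = new_severity
    return rows
```

Matching on the title prefix covers the per-feature-class variants like "Invalid Geometry Check (Parcels)" while leaving custom-written checks untouched.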
Posted 02-19-2015 11:24 AM

POST
Based on this discussion and another one, I am currently attempting the same synchronize using Desktop 10.1 instead of 10.2. The synchronize is still running on 10.1, which is better than the crash we experienced within 30 seconds on 10.2, so I am hopeful!
Posted 11-10-2014 10:10 AM