R&H usage

06-11-2019 05:59 AM
Occasional Contributor III

I know Nathan has stated the Lrs_Edit_Log is not meant for human consumption and I don’t disagree, but one can still find what I believe is useful information, especially if you compare years. Here is the type/number of edits I calculated for NCDOT for 2018 using 10.5.1:

TransactionDate > '1/1/2018' AND TransactionDate < '1/1/2019' AND ActivityType = X
Edit log Code/R&H Route action name/number of records
1 Create Route 4684
3 Reverse Route 140
4 Retire Route 1260
5 Extend Route 1070
6 Reassign Route 1020
7 Realign Route 349
12 Cartographic Realignment 19139
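Counts like these can be tallied in one pass rather than running the query once per ActivityType. A minimal sketch using only the standard library, with made-up rows standing in for what a cursor over the edit log would return (the code-to-name mapping follows the list above):

```python
from collections import Counter

# Hypothetical rows, standing in for arcpy.da.SearchCursor output over the
# edit log filtered to the 2018 date range -- each row is a one-element
# tuple holding ActivityType.
rows = [(1,), (1,), (12,), (4,), (12,), (5,), (1,)]

# Edit-log codes mapped to R&H route action names (from the table above).
action_names = {
    1: 'Create Route', 3: 'Reverse Route', 4: 'Retire Route',
    5: 'Extend Route', 6: 'Reassign Route', 7: 'Realign Route',
    12: 'Cartographic Realignment',
}

counts = Counter(row[0] for row in rows)
for code in sorted(counts):
    print(code, action_names.get(code, 'Unknown'), counts[code])
```

With a real cursor you would swap the sample list for the cursor iterator; the Counter does the rest.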

What are your stats?

34 Replies
New Contributor III

I'll take any opportunity I get to play with Pandas!

Those stats are very impressive. You guys are churning out the transactions! Hopefully once we get our implementation solidified, we'll see these numbers grow.

Here's what I found for NYS:

                            COUNT
ACTIVITY
01 - Create route 382
02 - Calibrate route 3204
03 - Reverse route 211
04 - Retire route 1169
05 - Extend route 494
06 - Reassign route 1272
07 - Realign route 806
12 - Carto realign/GP Tool 1404
13 - Load route 46

>>> sum(log_df_by_activity['COUNT'])
8988

There was a lot of route loading that went on in 2018. I would need to parse out the ROUTE_IDs from the log table's blob field to get a true indication of how many routes were loaded into the system, and I don't think the juice is worth the squeeze.

Here's how I came up with the table:

import datetime

import arcpy
import numpy as np
import pandas as pd

arcpy.env.workspace = r'Database Connections\prod_elrs_elrs_ad_lockroot.sde'

log_table = arcpy.ListTables('*Edit_Log')[0]

# Restrict to calendar year 2018
where_clause = (
    "TRANSACTIONDATE >= '{}' ".format(datetime.datetime(2018, 1, 1)) +
    "AND TRANSACTIONDATE < '{}'".format(datetime.datetime(2019, 1, 1))
)
table_array = arcpy.da.TableToNumPyArray(
    log_table,
    ['TRANSACTIONDATE', 'ROUTEID', 'ACTIVITYTYPE', 'USERNAME'],
    where_clause=where_clause
)
log_df = pd.DataFrame(table_array)

# Map activity codes to human-readable descriptions
condition_list = [
    log_df.ACTIVITYTYPE == 1, log_df.ACTIVITYTYPE == 2,
    log_df.ACTIVITYTYPE == 3, log_df.ACTIVITYTYPE == 4,
    log_df.ACTIVITYTYPE == 5, log_df.ACTIVITYTYPE == 6,
    log_df.ACTIVITYTYPE == 7, log_df.ACTIVITYTYPE == 12,
    log_df.ACTIVITYTYPE == 13,
]
choice_list = [
    'Create route', 'Calibrate route',
    'Reverse route', 'Retire route',
    'Extend route', 'Reassign route',
    'Realign route', 'Carto realign/GP Tool',
    'Load route',
]

log_df['ACTIVITYDESCRIPTION'] = np.select(condition_list, choice_list)

# Zero-padded code prefix, e.g. '01 - Create route'
log_df['ACTIVITY'] = (
    log_df['ACTIVITYTYPE'].apply(lambda x: '{:02d}'.format(x)) +
    ' - ' +
    log_df['ACTIVITYDESCRIPTION']
)

log_df_by_activity = log_df.groupby(['ACTIVITY']).count()
log_df_by_activity = log_df_by_activity[['ROUTEID']]
log_df_by_activity.columns = ['COUNT']
print(log_df_by_activity)

log_df_by_user = log_df.groupby(['USERNAME']).count()
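If you'd rather not maintain parallel condition and choice lists for `np.select`, the same label can be built from a plain dictionary. A small stdlib-only sketch (the description strings come from the table above; `'Unknown'` for unmapped codes is my assumption):

```python
descriptions = {
    1: 'Create route', 2: 'Calibrate route', 3: 'Reverse route',
    4: 'Retire route', 5: 'Extend route', 6: 'Reassign route',
    7: 'Realign route', 12: 'Carto realign/GP Tool', 13: 'Load route',
}

def activity_label(code):
    # Zero-padded code plus description, e.g. '01 - Create route'
    return '{:02d} - {}'.format(code, descriptions.get(code, 'Unknown'))

print(activity_label(12))  # 12 - Carto realign/GP Tool
```

In the pandas workflow this would plug in as something like `log_df['ACTIVITYTYPE'].apply(activity_label)`, replacing both the `np.select` call and the string concatenation.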
New Contributor II

Andrew,

This is good stuff, and thanks for sharing the code. What I'd like is some way to do something similar to produce a readable audit trail of edits. But to do that I would need a programmatic equivalent of the "Show XML" right-click on the edit log. Have you (or anybody else) been able to do this?

New Contributor III

Phil,

I've hacked something together before.

I'm not very experienced with XML and I'm stuck with the base Python install at work, so there's probably a better way. I particularly don't like writing the XML to disk then reading it in with the parser all in a for loop.

I'm sure the library supports strings as input, I just didn't have time to figure it out when I wrote this. I've also heard Beautiful Soup is more intuitive, but I can't pip install it at work.

Edit: It's funny how we miss the obvious stuff when we're under pressure. It looks like you can avoid writing the XML file to disk by simply using the ET.fromstring() method. I've updated the code block. 

from pprint import pprint
import xml.etree.ElementTree as ET

import arcpy


arcpy.env.workspace = r'Database Connections\prod_elrs_elrs_ad_lockroot.sde'
log_table = arcpy.ListTables('*Edit_Log')[0]

# Single load-route transaction to inspect
where_clause = 'OBJECTID = 82595'
fields = ['EDITDATA', 'TRANSACTIONDATE']
route_ids = []
with arcpy.da.SearchCursor(log_table, fields, where_clause=where_clause) as cursor:
    for row in cursor:
        xml = row[0].tobytes()  # EDITDATA blob comes back as a memoryview
        load_date = row[1]

        root = ET.fromstring(xml)
        for child in root:
            for grandchild in child:
                route_id = grandchild.text
                if route_id:
                    route_ids.append([int(route_id), load_date])

pprint(route_ids)
[[100217011, datetime.datetime(2019, 5, 20, 10, 37, 34)],
[100428081, datetime.datetime(2019, 5, 20, 10, 37, 34)],
[100294021, datetime.datetime(2019, 5, 20, 10, 37, 34)],
[100168012, datetime.datetime(2019, 5, 20, 10, 37, 34)]]
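The `ET.fromstring()` approach can be exercised on a small XML fragment without touching the edit log at all. A sketch with a made-up structure (the element names here are invented; the real EditData XML nests differently depending on the activity):

```python
import xml.etree.ElementTree as ET

# Hypothetical two-level fragment mimicking the root/child/grandchild
# walk in the cursor loop above.
xml = (b'<edit><routes>'
       b'<routeId>100217011</routeId>'
       b'<routeId>100428081</routeId>'
       b'</routes></edit>')

root = ET.fromstring(xml)
# iter() walks all descendants, so the nesting depth doesn't matter
route_ids = [int(el.text) for el in root.iter('routeId') if el.text]
print(route_ids)  # [100217011, 100428081]
```

Using `root.iter(tag)` instead of two nested loops also keeps working if Esri ever nests the route IDs one level deeper.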
New Contributor II

Andrew, good stuff. The edit log adds an extra difficulty over just working with XML: it stores the XML as a blob in the EditData field. That is the part that has me stuck. I don't know how to unlock that blob data.

New Contributor III

Phil,

It definitely adds some complexity, but I can see why Esri made this decision. Our log table would be nothing but load route activities at this point. If you come up with a more efficient way to parse the xml once it's extracted in the cursor, I'd love to see it.
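For the blob question specifically: the cursor hands EditData back as a `memoryview`, so "unlocking" it is just a matter of materializing the bytes before parsing. A minimal sketch, with a hard-coded memoryview (and invented element names) standing in for `row[0]` from the SearchCursor:

```python
import xml.etree.ElementTree as ET

# Stand-in for the EditData blob as returned by arcpy.da.SearchCursor.
blob = memoryview(b'<LrsEdit><RouteId>100294021</RouteId></LrsEdit>')

xml_bytes = blob.tobytes()       # materialize the blob as bytes
root = ET.fromstring(xml_bytes)  # parse directly -- no temp file needed
print(root.find('RouteId').text)
```

The `.tobytes()` call is the whole trick; from there it's ordinary ElementTree work.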

Occasional Contributor III

Nice Andrew.

Yes, we have a few transactions, but with up to 11 editors at one point I would hope so.

I dropped calibration points out because our numbers would look over-inflated, as we typically have more than the traditional 2 points per route from our data load. For 2018, our Calibrate Route count was 32463 records.

We don't have any loaded routes because we did that work in 2016, before we went live, when we were fixing branched routes etc. Our 2017 numbers are slightly below the 2018 totals, but 2019 seems on pace with 2018. As with anything, I'm not 100% sure what the numbers really mean, other than that we have touched a lot of data; if we want to put metrics behind them in the future, we have something comparable through the years.

Occasional Contributor III

With a new year, I thought I would report on NCDOT's 2019 stats.

TransactionDate > '1/1/2019' AND TransactionDate < '1/1/2020' AND ActivityType = X

Edit log Code/R&H Route action name/number of records
1 Create Route 7476
3 Reverse Route 127
4 Retire Route 4563
5 Extend Route 1992
6 Reassign Route 1058
7 Realign Route 795
12 Cartographic Realignment 32154

Please share your stats.

New Contributor III

Nice! You've been busy. Here's where NYSDOT came in for the year:

                            COUNT
ACTIVITY
01 - Create route 1425
02 - Calibrate route 964
03 - Reverse route 193
04 - Retire route 843
05 - Extend route 555
06 - Reassign route 196
07 - Realign route 626
12 - Carto realign/GP Tool 1512
13 - Load route 7

ACTIVITY for 2019-2020, 1471 total
1 - Create Route: 9
2 - Calibrate Route: 1203
3 - Reverse Route: 0
4 - Retire: 86
5 - Extend: 11
6 - Reassign: 35
7 - Realign: 17
12 - Carto Realign: 107
13 - Load: 3