Export only dominant routes from LRS?

01-28-2019 09:24 AM
AndrewVitale3

I'm looking to export the entirety of our enterprise LRS out of ArcMap, but with a caveat. When there is a concurrency, I would like only the dominant route present in the exported data.

I've got a workflow that seems to give me what I want, but it's a lot of steps.

Here's what I'm currently using:

  1. Run the Calculate Route Concurrencies GP tool with the appropriate temporal view date
  2. Export all non-retired routes into a separate LRS feature class
  3. Select the dominant routes from the output table, and run the Make Route Event GP tool from the Location Referencing toolbox to create an event of dominant routes
  4. Use the Erase GP tool from the Analysis toolbox to remove all concurrent sections from the LRS feature class
  5. Merge the dominant events with the LRS feature class that's output from Erase
  6. Dissolve on Route ID
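For what it's worth, the selection in step 3 boils down to a simple filter on the concurrency table. Here's a toy, arcpy-free sketch of that filter (the route IDs and measures are made up; `DominantFlag` and `DominantError` are real fields on the Calculate Route Concurrencies output table):

```python
# Toy rows mimicking the output of Calculate Route Concurrencies.
# The DominantFlag / DominantError field names come from that tool's
# output table; the routes and measures are invented for illustration.
concurrency_rows = [
    {"RouteId": "R1", "FromMeasure": 0.0, "ToMeasure": 2.5,
     "DominantFlag": 1, "DominantError": 0},
    {"RouteId": "R2", "FromMeasure": 0.0, "ToMeasure": 2.5,
     "DominantFlag": 0, "DominantError": 0},
    {"RouteId": "R3", "FromMeasure": 5.0, "ToMeasure": 6.0,
     "DominantFlag": 1, "DominantError": 4},
]

# Keep only sections flagged dominant, skipping error code 4 -- the same
# SQL filter the script below uses: DominantFlag = 1 AND DominantError <> 4
dominant = [row for row in concurrency_rows
            if row["DominantFlag"] == 1 and row["DominantError"] != 4]

print([row["RouteId"] for row in dominant])  # -> ['R1']
```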

Does anyone have a simpler workflow to accomplish this? Do you see any flaws in this current workflow?

Thanks in advance.

Edit (1/30/2019):

I'm including a Python script that implements something similar to Amit Hazra's ModelBuilder screenshot. Maybe someone will find it useful. Thanks for the tips, RHUG.

Edit (3/17/2020):

I ended up turning this into an ArcGIS Desktop Geoprocessing Tool as well. It's only been tested on ArcGIS Desktop 10.5.1 with an LRSN Route Network stored in a file geodatabase or SQL Server 2016 geodatabase. If you're using a different RDBMS, the where_clause that applies the Temporal View Date may be different.
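To illustrate why the where_clause is RDBMS-dependent: the date-literal syntax differs between, say, a file geodatabase and SQL Server. A minimal sketch of building such a clause (the `FROMDATE`/`TODATE` field names and the `temporal_where_clause` helper are mine, for illustration only; check your LRS schema and your DBMS's SQL reference before relying on the literal formats):

```python
from datetime import datetime

def temporal_where_clause(view_date, workspace_type):
    """Build a where_clause keeping rows active on view_date.

    FROMDATE/TODATE and the literal formats below are illustrative --
    verify against your LRS schema and your DBMS's date syntax.
    """
    stamp = view_date.strftime("%Y-%m-%d %H:%M:%S")
    if workspace_type == "filegdb":
        # File geodatabase SQL expects the date keyword before the literal
        literal = "date '{}'".format(stamp)
    elif workspace_type == "sqlserver":
        # SQL Server accepts a plain quoted datetime string
        literal = "'{}'".format(stamp)
    else:
        raise ValueError("unsupported workspace type: " + workspace_type)
    return ("(FROMDATE IS NULL OR FROMDATE <= {0}) AND "
            "(TODATE IS NULL OR TODATE > {0})").format(literal)

print(temporal_where_clause(datetime(2019, 1, 16), "filegdb"))
```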

GitHub - vitale232/ExportDominantNetwork: Export dominant routes from an Esri R&H ALRS. 

I've also attached a .zip directory containing the tool as of today's date.

import datetime
import os

import arcpy


def make_dominant_network(input_routes, concurrency_table, overwrite_flag=False):
    """
    After creating gaps in the input network where concurrencies exist,
    the geometries must be split to single parts. Then use the m-values
    from the single part geometries to create new routes. This is done to avoid
    non-monotonic routes. Non-monotonicity can be introduced from multipart
    outputs in GP tools, which are likely to reorder the part indices
    """
    dom_table_path = os.path.join(
        os.path.dirname(input_routes),
        'dominant_table'
    )
    dom_events_path = os.path.join(
        os.path.dirname(input_routes),
        'dominant_event'
    )
    erased_routes_path = os.path.join(
        os.path.dirname(input_routes),
        'milepoint_concurr_gaps'
    )
    erased_routes_singlepart_path = os.path.join(
        os.path.dirname(input_routes),
        'milepoint_concurr_gaps_singlepart'
    )
    merged_routes_path = os.path.join(
        os.path.dirname(input_routes),
        'milepoint_dom_event_merge'
    )
    output_path = os.path.join(
        os.path.dirname(input_routes),
        'milepoint_dominant_network'
    )

    if overwrite_flag:
        # Delete every intermediate and the final output so they get rebuilt
        for path in (dom_table_path, dom_events_path, erased_routes_path,
                     erased_routes_singlepart_path, merged_routes_path,
                     output_path):
            if arcpy.Exists(path):
                print(' overwrite deleting {}'.format(path))
                arcpy.Delete_management(path)

    # Assume the m fields need to be (re)built unless an existing merge
    # output already carries them
    add_m_fields = True
    if not overwrite_flag and arcpy.Exists(merged_routes_path):
        field_names = [field.name for field in arcpy.ListFields(merged_routes_path)]
        add_m_fields = not ('m_min' in field_names and 'm_max' in field_names)

    # Begin geoprocessing logic
    if not arcpy.Exists(dom_table_path):
        print('\nSubsetting concurrency table to only dominant events')
        where_clause = "(DominantFlag = 1) AND (DominantError <> 4)"
        print(' {}'.format(where_clause))
        arcpy.TableToTable_conversion(
            concurrency_table,
            os.path.dirname(dom_table_path),
            os.path.basename(dom_table_path),
            where_clause=where_clause
        )

    if not arcpy.Exists(dom_events_path):
        print('\nCreating event for route dominance')
        line_props = 'RouteId LINE FromMeasure ToMeasure'
        dom_layer = arcpy.MakeRouteEventLayer_lr(
            input_routes, 'ROUTE_ID', dom_table_path, line_props,
            'dom_layer'
        )
        arcpy.CopyFeatures_management(
            dom_layer, dom_events_path
        )
        print(' {}'.format(dom_events_path))

    if not arcpy.Exists(erased_routes_path):
        print('\nCreating network gaps at concurrencies')
        arcpy.Erase_analysis(input_routes, dom_events_path, erased_routes_path)
        print(' {}'.format(erased_routes_path))

    if not arcpy.Exists(erased_routes_singlepart_path):
        print('\nSplitting gapped network to single-part geometries')
        arcpy.MultipartToSinglepart_management(
            erased_routes_path, erased_routes_singlepart_path
        )
        print(' {}'.format(erased_routes_singlepart_path))


    if not arcpy.Exists(merged_routes_path):
        print('\nMerging dominant routes with gapped network')
        field_mapping = map_fields(
            [erased_routes_singlepart_path, 'ROUTE_ID'],
            [dom_events_path, 'RouteId'],
            'ROUTE_ID'
        )
        arcpy.Merge_management(
            [erased_routes_singlepart_path, dom_events_path],
            merged_routes_path,
            field_mapping
        )
        print(' {}'.format(merged_routes_path))
        add_m_fields = True

    if add_m_fields:
        print('\nAdding m-values to the merged routes attribute table')
        merged_field_names = [
            field.name for field in arcpy.ListFields(merged_routes_path)
        ]
        if 'm_min' not in merged_field_names:
            arcpy.AddField_management(
                merged_routes_path,
                'm_min',
                'DOUBLE',
            )
            print(' created field: m_min')
        if 'm_max' not in merged_field_names:
            arcpy.AddField_management(
                merged_routes_path,
                'm_max',
                'DOUBLE',
            )
            print(' created field: m_max')
        update_fields = ['SHAPE@', 'm_min', 'm_max']
        with arcpy.da.UpdateCursor(merged_routes_path, update_fields) as update_cursor:
            for row in update_cursor:
                shape = row[0]
                m_min = shape.extent.MMin
                m_max = shape.extent.MMax
                update_cursor.updateRow([shape, m_min, m_max])
        print(' table update: complete')
    
    if not arcpy.Exists(output_path):
        print('\nCreating final network of only dominant routes')
        arcpy.CreateRoutes_lr(
            merged_routes_path, 'ROUTE_ID', output_path,
            measure_source='TWO_FIELDS',
            from_measure_field='m_min', to_measure_field='m_max',
            build_index='INDEX'
        )
        print(' {}'.format(output_path))


def map_fields(table_field_a, table_field_b, output_name):
    field_mappings = arcpy.FieldMappings()

    field_map = arcpy.FieldMap()
    field_map.addInputField(*table_field_a)
    field_map.addInputField(*table_field_b)
    output_field = field_map.outputField
    output_field.name = output_name
    field_map.outputField = output_field
    field_mappings.addFieldMap(field_map)

    return field_mappings


def main():
    input_routes =      r'D:\Pavement\Input_Data\Milepoint.gdb\Milepoint_20190116'
    concurrency_table = r'D:\Pavement\Input_Data\Milepoint.gdb\route_concurrencies_tvd20190116'
    overwrite_flag =    False

    make_dominant_network(input_routes, concurrency_table, overwrite_flag=overwrite_flag)


if __name__ == '__main__':
    start_time = datetime.datetime.now()
    print('Running script: {0}\nStart time: {1}'.format(
        os.path.abspath(__file__), start_time)
    )

    main()

    end_time = datetime.datetime.now()
    print('\nCompleted at: {}.\nTime to complete: {}'.format(
        end_time, end_time - start_time)
    )
12 Replies

This post has been a big benefit to Kansas. We have taken the script and concepts discussed here and incorporated them into a script we run nightly to update a publication database from our transactional editing database. The workflow begins with a post, reconcile, and compress in our transactional environment, and leverages Python lists of the event and network feature classes and tables to act on.

I will copy the Python here, including the CreatePrimaryNetwork function. One part I am still working on, around line 460, should serve to derive a network, a statewide LRM, from a county LRM based on shape length. We run this on a server using Task Scheduler; it's good to test that with PowerShell to make sure Windows will run it, which means the script has to be indented with spaces instead of actual tabs. We also output the primary routes as a single-part feature class to make it easier to spot the gapped routes this method outputs.
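For reference, the date-stamped scratch geodatabase naming used in the script below can be reproduced in plain Python (the `dated_gdb_name` function name is mine; the zero-padded year/month/day scheme matches the script):

```python
from datetime import datetime

def dated_gdb_name(prefix, target, when=None):
    """Build names like 'PublishProd20190220.gdb' from a prefix, a target
    environment name, and a date (defaults to today)."""
    when = when or datetime.now()
    return "{}{}{:04d}{:02d}{:02d}.gdb".format(
        prefix, target, when.year, when.month, when.day)

print(dated_gdb_name("Publish", "Prod", datetime(2019, 2, 20)))
# -> PublishProd20190220.gdb
```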

  

'''
KHUB geodatabase maintenance
Post Reconcile Analyze and Compress
Created on Feb 12, 2019; revised Feb 20, 2019
@author: kyleg
'''
#project publication database into webmerc
webmerc="""PROJCS[
    'WGS_1984_Web_Mercator_Auxiliary_Sphere',
    GEOGCS['GCS_WGS_1984',
    DATUM['D_WGS_1984',
    SPHEROID['WGS_1984',6378137.0,298.257223563]],
    PRIMEM['Greenwich',0.0],
    UNIT['Degree',0.0174532925199433]],
    PROJECTION['Mercator_Auxiliary_Sphere'],
    PARAMETER['False_Easting',0.0],
    PARAMETER['False_Northing',0.0],
    PARAMETER['Central_Meridian',0.0],
    PARAMETER['Standard_Parallel_1',0.0],
    PARAMETER['Auxiliary_Sphere_Type',0.0],
    UNIT['Meter',1.0]]"""
transform_method=r"WGS_1984_(ITRF00)_To_NAD_1983"

in_coor_system="""PROJCS[
    'NAD_83_Kansas_Lambert_Conformal_Conic_Feet',
    GEOGCS['GCS_North_American_1983',
    DATUM['D_North_American_1983',
    SPHEROID['GRS_1980',6378137.0,298.257222101]],
    PRIMEM['Greenwich',0.0],
    UNIT['Degree',0.0174532925199433]],
    PROJECTION['Lambert_Conformal_Conic'],
    PARAMETER['false_easting',1312333.333333333],
    PARAMETER['false_northing',0.0],
    PARAMETER['central_meridian',-98.25],
    PARAMETER['standard_parallel_1',37.5],
    PARAMETER['standard_parallel_2',39.5],
    PARAMETER['scale_factor',1.0],
    PARAMETER['latitude_of_origin',36.0],
    UNIT['Foot_US',0.3048006096012192]]"""

    
    
def Analyze():
    rh = r'C:\Users\planadm\GISDATA\DbConnections\Prod\RH@KHTransKhub.sde'
    sde = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHTransKHUB.sde'
    from arcpy import AnalyzeDatasets_management
    print("this function analyzes datasets for "+rh + " and " + sde)
    print("analyzing RH and SDE in source")
    AnalyzeDatasets_management(rh, "NO_SYSTEM", "RH.Calibration_Point;RH.Centerline;RH.Centerline_Sequence;RH.Lrs_Edit_Log;RH.Lrs_Event_Behavior;RH.Lrs_Locks;RH.Lrs_Metadata;RH.Redline;", "ANALYZE_BASE", "ANALYZE_DELTA", "ANALYZE_ARCHIVE")
    AnalyzeDatasets_management(sde, "SYSTEM", "", "NO_ANALYZE_BASE", "NO_ANALYZE_DELTA", "NO_ANALYZE_ARCHIVE")

def Compress():
    sde = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHTransKHUB.sde'
    from arcpy import Compress_management
    print("compressing "+ sde)
    Compress_management(sde)

def PostAndReconcile():
    sde = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHTransKHUB.sde'
    from arcpy import ReconcileVersions_management
    ReconcileVersions_management(sde, "ALL_VERSIONS", "sde.DEFAULT", "QCADMIN.QC_Lockroot", "LOCK_ACQUIRED", "ABORT_CONFLICTS", "BY_OBJECT", "FAVOR_TARGET_VERSION", "POST", "KEEP_VERSION", "")
    print("lockroot reconciled to default without conflicts:")

def FullPostAndReconcileDefault():
    sde = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHTransKHUB.sde'
    from arcpy import ReconcileVersions_management, Compress_management, DisconnectUser, AcceptConnections
    print("post and reconcile version with version deletion for " +sde)
    AcceptConnections(sde, False)
    DisconnectUser(sde, "ALL")
    #verListSource = [ver for ver in ListVersions(SDESource) if ver.lower() != 'sde.default' and ver.lower() != 'rh.masterqc']
    print("reconciling MasterQC to Default")
    ReconcileVersions_management(sde, "ALL_VERSIONS", "sde.default", "QCADMIN.QC_Lockroot", "LOCK_ACQUIRED", "ABORT_CONFLICTS", "BY_OBJECT", "FAVOR_TARGET_VERSION", "POST", "KEEP_VERSION")
    Compress_management(sde)
    AcceptConnections(sde, True)

def FullPostAndReconcileSourcetoDefaultCollapseVersions():
    sde = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHTransKHUB.sde'
    from arcpy import (ReconcileVersions_management, Compress_management, DisconnectUser,
                       AcceptConnections, DeleteVersion_management, ListVersions)
    print("post and reconcile version with version deletion for " + sde)
    AcceptConnections(sde, False)
    DisconnectUser(sde, "ALL")
    # Collapse every child version; skip DEFAULT, MasterQC, and the lockroot target
    source_versions = [ver for ver in ListVersions(sde)
                       if ver.lower() not in ('sde.default', 'rh.masterqc', 'qcadmin.qc_lockroot')]
    for SourceVersion in source_versions:
        print(SourceVersion)
        ReconcileVersions_management(sde, "ALL_VERSIONS", "QCADMIN.QC_Lockroot", SourceVersion, "LOCK_ACQUIRED", "NO_ABORT", "BY_OBJECT", "FAVOR_TARGET_VERSION", "POST", "DELETE_VERSION")
        try:
            DeleteVersion_management(sde, SourceVersion)
        except:
            print(SourceVersion+ " deleted after reconcile/post")
    print("reconciling MasterQC to Default")
    ReconcileVersions_management(sde, "ALL_VERSIONS", "sde.default", "RH.MasterQC", "LOCK_ACQUIRED", "NO_ABORT", "BY_OBJECT", "FAVOR_TARGET_VERSION", "POST", "DELETE_VERSION")
    try:
        DeleteVersion_management(sde, "RH.MasterQC")
    except:
        print("MasterQC deleted after reconcile/post - OR - if this is the only message, then nothing was done because versions do not exist")
    Compress_management(sde)
    AcceptConnections(sde, True)

def RebuildIndexes():
    print('not set up yet')
    
def PublicateNetworkProd2Target(target):
    from arcpy import env
    from datetime import datetime
    from KhubFCList import NetworksList, EventList, IntersectionList, TestEvents
    from arcpy import (da, FeatureClassToFeatureClass_conversion, ChangePrivileges_management, AddMessage, MakeFeatureLayer_management, 
    AddWarning, Project_management, TruncateTable_management, Append_management, 
    ExecuteError, CreateTable_management, AddField_management, Exists, GetMessages, 
    DisconnectUser, Exists, CreateFileGDB_management, Delete_management)
    pub_event_list = EventList  #TestEvents/EventList - use test events for smaller events lists or testing
    pub_network_list = NetworksList
	
    env.overwriteOutput = True

    #Transactional Database connection to default to copy to publication
    RH_Database = r'C:\Users\planadm\GISDATA\DbConnections\Prod\RH@KHTransKhub.sde'

    if target == 'Prod':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\Prod\geo@KHPub.sde'
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHPubProd.sde'
    elif target == 'Test':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHTest\geo@KHPubTest.sde'
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\KHTest\sde@KHPubtest.sde'
    else:
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHDev\geo@KHPubDev.sde'    
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\KHDev\sde@KHPubDev.sde'
        
    wsruncode = datetime.now()
    dbname = "Publish"+str(target)+str(wsruncode.year).zfill(4)+str(wsruncode.month).zfill(2)+str(wsruncode.day).zfill(2)
    ws = env.workspace = "C:\\temp\\" + dbname + ".gdb"
    # Always start from a fresh scratch geodatabase
    if Exists(ws):
        Delete_management(ws)
    CreateFileGDB_management(r"C:/temp", dbname, "CURRENT")
        
        
    print (pub_database)
    pub_database_path = pub_database
    env.workspace = pub_database
    

    status_table = (pub_database+r"/KHUBPub.GEO.Publication_Status")
    if not Exists(status_table):
        CreateTable_management(pub_database, "Publication_Status")
        AddField_management(status_table, "DATE", "DATE", "", "", "", "Date")
        AddField_management(status_table, "LAYER_NAME", "TEXT", "", "", "", "Layer Name")
        AddField_management(status_table, "STATUS", "TEXT", "", "", "", "Status")
        AddField_management(status_table, "MESSAGE", "TEXT", "", "", "", "Message")
    
    status_fields = ["DATE", "LAYER_NAME", "STATUS", "MESSAGE"]
    

    def update_status_table(status_table, data, fields, status_value, message_value):
        now = datetime.now()
        with da.InsertCursor(status_table, fields) as i_cur:  #@UnresolvedImport
            insert_values = (now, data, status_value, message_value)
            i_cur.insertRow(insert_values)
        del i_cur
            
    i = 1

    for event in pub_event_list:  #for testing add the [2] to enumerate the list of events
        try:
            event_name = event.split(".")[-1]
            print(RH_Database+r'/KHUB.RH.'+ event +" " +event_name)
            DisconnectUser(pub_sdedb, "ALL")
            FeatureClassToFeatureClass_conversion(RH_Database+r'/KHUB.RH.'+ event, ws, str(event_name)+"lyr", where_clause="""(LRSFromDate is null or LRSFromDate<=CURRENT_TIMESTAMP) and (LRSToDate is null or LRSToDate>CURRENT_TIMESTAMP)""")
            Project_management((ws+r"/"+str(event_name)+"lyr"), (pub_database_path+r'/KHUBPub.GEO.'+event_name), webmerc, transform_method, in_coor_system, "NO_PRESERVE_SHAPE", "", "NO_VERTICAL")
            ChangePrivileges_management((pub_database_path+r'/KHUBPub.GEO.'+event_name), "readonly", View="GRANT", Edit="")
            AddMessage('{0} {1}: {2} {3} {4}'.format(event_name, "copied over publication database", i, "of", len(pub_event_list)))
            update_status_table(status_table, event_name, status_fields, "Overwrite in Publication Database", None)
                
            i += 1    
        except ExecuteError:
            AddWarning('{0} {1}: {2} {3} {4}'.format(event_name, "produced an error", i, "of", len(pub_event_list)))
            message = GetMessages(2)
            update_status_table(status_table, event_name, status_fields, "Error Occurred", message)
            i += 1
            del event
            continue

    for network in NetworksList:
        try:
            network_name = network.split(".")[-1]
            DisconnectUser(pub_sdedb, "ALL")
            FeatureClassToFeatureClass_conversion(RH_Database+r'/KHUB.RH.'+ network, ws, str(network_name), """(LRSFromDate is null or LRSFromDate<=CURRENT_TIMESTAMP) and (LRSToDate is null or LRSToDate>CURRENT_TIMESTAMP)""")
            Project_management(ws+r"/"+str(network_name), (pub_database_path+r'/KHUBPub.GEO.'+network_name), webmerc, transform_method, in_coor_system, "NO_PRESERVE_SHAPE", "", "NO_VERTICAL")
            ChangePrivileges_management((pub_database_path+r'/KHUBPub.GEO.'+network_name), "readonly", View="GRANT", Edit="")
            AddMessage('{0} {1}'.format(network_name, "overwrote in publication database"))
            update_status_table(status_table, network_name, status_fields, "Overwrite in Publication Database", None)
                
        except ExecuteError:
            AddWarning('{0} {1}'.format(network_name, "produced an error"))
            message = GetMessages(2)
            update_status_table(status_table, network_name, status_fields, "Error Occurred", message)
            del network
            continue
            
    for intersec in IntersectionList:
        #intersections have different lrs date fields than the rest of the Roads and Highways LRS, so get them all to pub (for now)
        try:
            intersec_name = intersec.split(".")[-1]
            DisconnectUser(pub_sdedb, "ALL")

            Project_management(RH_Database+r'/KHUB.RH.'+ intersec, (pub_database_path+r'/KHUBPub.GEO.'+intersec_name), webmerc, transform_method, in_coor_system, "NO_PRESERVE_SHAPE", "", "NO_VERTICAL")
            AddMessage('{0} {1}'.format(intersec_name, "overwrote in publication database"))
            update_status_table(status_table, intersec_name, status_fields, "Overwrite in Publication Database", None)
                
        except ExecuteError:
            AddWarning('{0} {1}'.format(intersec_name, "produced an error"))
            message = GetMessages(2)
            update_status_table(status_table, intersec_name, status_fields, "Error Occurred", message)
            continue
            
            
            
def RecreatePubGDB(target):
    from arcpy import CreateEnterpriseGeodatabase_management, CreateDatabaseConnection_management, CreateDatabaseUser_management, DisconnectUser
    DBName = "KhubPub"
    if target == 'Prod':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\Prod\GeoAdmin@KHPubProd.sde'
        pubinstance = r"khdbpubprod\khubpub_prod"

    elif target == 'Test':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHTest\GeoAdmin@KHPubTest.sde'
        pubinstance = r"khdbpubtest\khubpub_test"

    else:
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHDev\GeoAdmin@KHPubDev.sde'   
        pubinstance = r"khdbpubdev\khubpub_dev"
        
    
    sapass = raw_input('sysadmin password')
    sdepass = raw_input('sde password')
    geopass = raw_input('Geo password')
    CreateEnterpriseGeodatabase_management("SQL_Server", pubinstance, DBName, "DATABASE_AUTH", "geo_admin", sapass, "SDE_SCHEMA", "sde", sdepass, "", r"C:\Users\planadm\GISDATA\keycodes\License10.6\sysgen\keycodes")
    CreateDatabaseUser_management(pub_database,  "DATABASE_USER", "geo", geopass)
    CreateDatabaseUser_management(pub_database,  "DATABASE_USER", "readonly", "readonly")
    #geo and readonly users are granted with appropriate privileges in the instance model database, which should therefore be passed to the geodatabse privilege when added to the geodatabase
    
def CreatePrimaryNetwork(target):
    # Create_Dominant_Routes.py (ArcMap Only)
    #  
    # Description: Create concurrency table (WITH STATE SYS INV DIR ROUTES ONLY)
    # and copies Prefixes 1,2,3 Routes to the local geodatabase C:\temp\CONCURRENT.gdb
    # https://community.esri.com/thread/227849-export-only-dominant-routes-from-lrs

    # utilizes workspace C:\temp\CONCURRENT+YYYYMMDD.gdb - attempts to delete it if it already exists
    # ---------------------------------------------------------------------------
    from datetime import datetime
    
    if target == 'Prod':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\Prod\Geo@KHPub.sde'
        pubinstance = r"khdbpubprod\khubpub_prod"
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHPubProd.sde'
        tds = r'C:\Users\planadm\GISDATA\DbConnections\Prod\TDSUser@KHProd.sde'
        RH_Database = tds

    elif target == 'Test':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHTest\Geo@KHPubTest.sde'
        pubinstance = r"khdbpubtest\khubpub_test"
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\KHTest\sde@KHPubtest.sde'
        tds = r'C:\Users\planadm\GISDATA\DbConnections\Prod\TDSUser@KHProd.sde'
        RH_Database = tds

    else:
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHDev\Geo@KHPubDev.sde'   
        pubinstance = r"khdbpubdev\khubpub_dev"
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\KHDev\sde@KHPubDev.sde'
        tds = r'C:\Users\planadm\GISDATA\DbConnections\Prod\TDSUser@KHProd.sde'
        RH_Database = tds
        target = 'Dev'
    
    from arcpy import (env, da, CreateFileGDB_management, AddMessage, AddWarning, Exists, MakeRouteEventLayer_lr, DisconnectUser,
    CheckExtension, CheckOutExtension, CreateRoutes_lr, ChangePrivileges_management, CopyFeatures_management, Delete_management, 
    MultipartToSinglepart_management, Copy_management, Merge_management, MakeTableView_management, Erase_analysis, ExecuteError, 
    GetMessages, LocateFeaturesAlongRoutes_lr, CheckInExtension, MakeFeatureLayer_management, FeatureClassToFeatureClass_conversion, 
    CreateTable_management, Append_management, AddField_management, CalculateField_management, TableToTable_conversion, Project_management,
    Dissolve_management, Statistics_analysis, AddJoin_management, RemoveJoin_management, AddIndex_management)
    
    wsruncode = datetime.now()
    dbname = "CONCURRENT"+str(target)+str(wsruncode.year).zfill(4)+str(wsruncode.month).zfill(2)+str(wsruncode.day).zfill(2)
    ws = env.workspace = "C:\\temp\\" + dbname + ".gdb"
    # Always start from a fresh scratch geodatabase
    if Exists(ws):
        Delete_management(ws)
    CreateFileGDB_management(r"C:/temp", dbname, "CURRENT")

    env.MResolution = 0.000000018939394
    env.MTolerance = 0.000000621369949
    #arcpy.env.XYTolerance="0.003280833333333 US Survey Foot"
    #arcpy.env.XYResolution="0.0001 US Survey Foot"

    db_connection=RH_Database
    env.overwriteOutput = True

    import sys

    # Bail out if a Roads & Highways license is not available
    if CheckExtension("Highways") == "Available":
        CheckOutExtension("Highways")
    else:
        print("Roads & Highways Extension not available")
        sys.exit()

    LRSCounty = db_connection+"\\KHUB.RH.LRS_County"
    KHUBRoutes = "KHUBRoutes"
    KHUBRoutesLyr="KHUBRoutesLyr"
    
    from locref import CalculateRouteConcurrencies
    whereClauseAll= "Prefix IN ('1','2','3','4','5')"
    MakeFeatureLayer_management (LRSCounty,KHUBRoutesLyr,whereClauseAll)
    CalculateRouteConcurrencies(KHUBRoutesLyr, "temp_table", "", "FIND_DOMINANCE")
    if Exists(ws + "\\concurr_tbl_all"):
        # Delete it if it does exist
        Delete_management(ws + "\\concurr_tbl_all")
    CreateTable_management(ws, "concurr_tbl_all", "temp_table")
    Append_management("temp_table","concurr_tbl_all")
    AddField_management("concurr_tbl_all", "Prefix", "TEXT", "", "", "1", "", "NULLABLE", "NON_REQUIRED", "")

    CalculateField_management("concurr_tbl_all", "Prefix", "Mid( [RouteId],4,1 )", "VB", "")
    print ("concurrent table created")
    DisconnectUser(pub_sdedb, "ALL")
    if Exists(pub_database+r'/KHUBPub.GEO.RouteConcurrency'):  
        #make a copy to update in TDSOutput Route Concurrency in Tdsuser schema
        Delete_management(pub_database+r'/KHUBPub.GEO.RouteConcurrency')
    else:
        pass
    TableToTable_conversion("concurr_tbl_all", pub_database, "RouteConcurrency", "ToDate IS NULL")
    #make a copy to update in TDSOutput Route Concurrency in Tdsuser schema
    ChangePrivileges_management((pub_database+r'/KHUBPub.GEO.RouteConcurrency'), "readonly", View="GRANT", Edit="")
    FeatureClassToFeatureClass_conversion(KHUBRoutesLyr, ws, KHUBRoutes, "LRSToDate IS NULL AND Prefix IN ( '1', '2', '3','4','5') ")
    
    
    CheckInExtension("Highways")
    print("copy fc completed")


    # Local variables:
    TargetRoutesPrefix1235 = ws+"\\KHUBRoutes"
    concurr_tbl = ws+"\\concurr_tbl_all"
    DominantFlag1Feat = ws+"\\DominantFlag1Feat"
    TargetRoutes_Erase = ws+"\\TargetRoutes_Erase"
    TargetRoutes_Erase_LocateFea = ws+"\\TargetRoutes_Erase_LocateFea"
    TargetRoutes_Erase_withMP = ws+"\\TargetRoutes_Erase_withMP"
    SegmentsToCreateRoutes = ws+"\\SegmentsToCreateRoutes"
    mergeset = ws+"\\mergeset"
    mergesetCopy = ws+"\\mergesetCopy"
    #DominantRoutesCreated = ws+"\\DominantRoutesCreated"
    DominantRoutesAll = ws+"\\DominantRoutesAll"
    DominantRtesSinglePart=ws+"\\DominantRtesSinglePart"

    concurr_tbl_View = "concurr_tbl_View"
    concurr_tbl_View_Events = "concurr_tbl_View Events"
    Output_Event_Table_Properties = "RID LINE FMEAS TMEAS"
    TargetRoutes_Erase_LocateFea_Events = "TargetRoutes_Erase_LocateFea_Events"
    mergesetCopy_Layer = "mergesetCopy_Layer"

    ##############################create Dominant route segments###########################################################

    # Process: Make Table View to select subset with query to include only  LRS_end_date is  null, dominant flag=1
    MakeTableView_management(concurr_tbl, concurr_tbl_View, "FromMeasure IS NOT NULL AND ToMeasure IS NOT NULL AND ToDate IS NULL AND DominantFlag = 1 AND RouteId NOT LIKE 'KTA%' AND Prefix IN ('1','2','3','4','5')")
    MakeRouteEventLayer_lr(TargetRoutesPrefix1235, "RouteId", concurr_tbl_View, "RouteId LINE FromMeasure ToMeasure", concurr_tbl_View_Events, "", "ERROR_FIELD", "NO_ANGLE_FIELD", "NORMAL", "ANGLE", "LEFT", "POINT")
    CopyFeatures_management(concurr_tbl_View_Events, DominantFlag1Feat, "", "0", "0", "0")

    ###############Erase analysis to get route segments that do not carry any route##########################################

    # Process: Erase, input is all routes, erase feature is dominant routes#######################################
    Erase_analysis(TargetRoutesPrefix1235, DominantFlag1Feat, TargetRoutes_Erase, "")
    # Process: The noncarrying route segments are run thr LFAR-Locate Features Along Routes to get LRS milepoints, output is a table
    LocateFeaturesAlongRoutes_lr(TargetRoutes_Erase, TargetRoutesPrefix1235, "RouteId", "0 Feet", TargetRoutes_Erase_LocateFea, Output_Event_Table_Properties, "FIRST", "DISTANCE", "ZERO", "FIELDS", "M_DIRECTON")
    MakeRouteEventLayer_lr(TargetRoutesPrefix1235, "RouteId", TargetRoutes_Erase_LocateFea, "rid LINE fmeas tmeas", TargetRoutes_Erase_LocateFea_Events, "", "ERROR_FIELD", "NO_ANGLE_FIELD", "NORMAL", "ANGLE", "LEFT", "POINT")
    CopyFeatures_management(TargetRoutes_Erase_LocateFea_Events, TargetRoutes_Erase_withMP, "", "0", "0", "0")
    print("completed erase fc, dominant fc")
    Copy_management(TargetRoutes_Erase_withMP, SegmentsToCreateRoutes, "FeatureClass")

    # Process: Add Field ToMeasure. LFAR GIVES IT AS TMEAS
    AddField_management(SegmentsToCreateRoutes, "ToMeasure", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    CalculateField_management(SegmentsToCreateRoutes, "ToMeasure", "[TMEAS]", "VB", "")

    AddField_management(SegmentsToCreateRoutes, "FromMeasure", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    CalculateField_management(SegmentsToCreateRoutes, "FromMeasure", "[FMEAS]", "VB", "")

    # Process: Merge output of erase, and the dominant rte seg, COMMON FIELDS WILL BE RouteId, FromMeasure,ToMeasure
    Merge_management([SegmentsToCreateRoutes, DominantFlag1Feat], mergeset)

    print("merge completed")

    CopyFeatures_management(mergeset, mergesetCopy, "", "0", "0", "0")
    MakeFeatureLayer_management(mergesetCopy, mergesetCopy_Layer, "\"Shape_Length\" > 0")

    CreateRoutes_lr(mergesetCopy_Layer, "RouteId", DominantRoutesAll, "TWO_FIELDS", "FromMeasure", "ToMeasure", "UPPER_LEFT", "1", "0", "IGNORE", "INDEX")

    print ("routes creation completed")

    #append local routes (prefixes 6, 7, 8) from the transactional LRS to the dominant routes
    MakeFeatureLayer_management(LRSCounty, "LRS_County_Layer678", "Prefix IN ('6', '7', '8') AND LRSToDate is null")
    Append_management('LRS_County_Layer678', DominantRoutesAll, "NO_TEST")
	
    # Create singlepart features
    MultipartToSinglepart_management(DominantRoutesAll,DominantRtesSinglePart)
    AddField_management(DominantRtesSinglePart, "MinMeas", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    AddField_management(DominantRtesSinglePart, "MaxMeas", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    CalculateField_management(DominantRtesSinglePart, "MaxMeas", "round(!Shape.extent.MMax!,7)  ", "PYTHON_9.3", "")
    CalculateField_management(DominantRtesSinglePart, "MinMeas", "round(!Shape.extent.MMin!,7)  ", "PYTHON_9.3", "")
    print ("single parts fc with measures created ")

    #more local variables for just the state route part
    TargetRoutesPrefix123 = ws+"\\SHSRoutes"
    whereClauseSHSPrimaryInv= "Substring(RouteId, 4, 1) in ('1', '2', '3') AND Substring(RouteID, 11, 1) = '0'"
    whereClauseSHS = "LRSToDate IS NULL AND Prefix in ('1', '2', '3')"
    SHSPI = ws+"\\SHSDomInv"
    countyDissolve = ws+ "\\LRS_County_Dissolve"
    StateRoutes=ws+"\\LRS_State"
    CntyDIOnState = ws+"\\CountyDomOnState"
    MinStatCntyDIOnState = ws+"\\CountyDomOnState_Statistics"
    #this above shouldn't really be a variable, see line 460 and the join at 469 and calculate at 470. The name needs to be
    
    
    # Create State Routes from County Routes - adding @427 on 12/24/2019
    FeatureClassToFeatureClass_conversion(KHUBRoutesLyr, ws, "SHSRoutes", whereClauseSHS)
    FeatureClassToFeatureClass_conversion(DominantRoutesAll, ws, "SHSDomInv", whereClauseSHSPrimaryInv)
    #FeatureClassToFeatureClass_conversion (TargetRoutesPrefix1235, TargetRoutesPrefix123, whereClauseSHS)
    # The dissolve needs to filter to just SHS; prefixes 4 and 5 should not be included
    Dissolve_management(TargetRoutesPrefix123, countyDissolve, "Prefix;RouteNum;Suffix;UniqueId;InvDir", "", "MULTI_PART", "DISSOLVE_LINES")
    AddField_management(countyDissolve, "StateRouteID", "TEXT", "", "", "12", "", "NULLABLE", "NON_REQUIRED", "")
    MakeFeatureLayer_management(countyDissolve, "StateUnique", "UniqueID is not null")
    MakeFeatureLayer_management(countyDissolve, "StateMainline", "UniqueID is null")
    CalculateField_management("StateMainline", "StateRouteID", "[Prefix]+ Right([RouteNum],3)+ [Suffix]+ [InvDir]", "VB", "")
    CalculateField_management("StateUnique", "StateRouteID", "[Prefix]+ Right([RouteNum],3)+ [Suffix]+ [InvDir]+ [UniqueId]", "VB", "")
    AddField_management(countyDissolve, "FromMeas", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    AddField_management(countyDissolve, "ToMeas", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    CalculateField_management(countyDissolve, "FromMeas", "0", "VB", "")
    CalculateField_management(countyDissolve, "ToMeas", "[Shape_Length]/5280", "VB", "")
    CreateRoutes_lr(countyDissolve, "StateRouteID", StateRoutes, "TWO_FIELDS", "FromMeas", "ToMeas", "LOWER_LEFT", "1", "0", "IGNORE", "INDEX")
    LocateFeaturesAlongRoutes_lr(SHSPI, StateRoutes, "StateRouteID", "0 Feet", CntyDIOnState, "RID LINE FMEAS TMEAS", "FIRST", "DISTANCE", "ZERO", "FIELDS", "M_DIRECTON")
    MakeTableView_management(CntyDIOnState, "CountyDomOnState_Matched", "Substring(RID, 2, 5) = Substring(RouteID, 7, 5) AND Substring(RID, 1, 1) = Substring( RouteId, 4, 1)", "", "#")
    Statistics_analysis("CountyDomOnState_Matched", MinStatCntyDIOnState, "FMEAS MIN;TMEAS MAX;OBJECTID COUNT", "RouteId;RID")
    #add index here
    #join the state measures into Dominant Routes All & SinglePart where they exist as state routes
    #also output the State Routes, projected, into the publication database as the State Route LRS - consider constructing fields as used in the LRM system
    MakeFeatureLayer_management(DominantRtesSinglePart, "StateMeasures", whereClauseSHSPrimaryInv)
    AddField_management("StateMeasures", "FromState", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    AddField_management("StateMeasures", "ToState", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    AddField_management("StateMeasures", "FromCounty", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    AddField_management("StateMeasures", "ToCounty", "DOUBLE", "", "", "", "", "NULLABLE", "NON_REQUIRED", "")
    CalculateField_management("StateMeasures", "FromCounty", "[MinMeas]", "VB")
    CalculateField_management("StateMeasures", "ToCounty", "[MaxMeas]", "VB")
    AddIndex_management(DominantRtesSinglePart, "RouteID", "RouteID1")
    AddIndex_management(MinStatCntyDIOnState, "RouteID", "RouteID2")
    MakeTableView_management(MinStatCntyDIOnState, "JoinStats")
    AddJoin_management("StateMeasures", "RouteId", "JoinStats", "RouteId", "KEEP_ALL")
    CalculateField_management("StateMeasures", "DominantRtesSinglePart.FromState", "[DominantRtesSinglePart.MinMeas] + [CountyDomOnState_Statistics.MIN_FMEAS]", "VB")
    CalculateField_management("StateMeasures", "DominantRtesSinglePart.ToState", "[DominantRtesSinglePart.MaxMeas] + [CountyDomOnState_Statistics.MIN_FMEAS]", "VB")
    RemoveJoin_management("StateMeasures", "CountyDomOnState_Statistics")
    #output these results to the publication database
    DisconnectUser(pub_sdedb, "ALL")
    Project_management(DominantRoutesAll, (pub_database+r'/KHUBPub.GEO.'+"LRS_County_Primary"), webmerc, transform_method, in_coor_system, "NO_PRESERVE_SHAPE", "", "NO_VERTICAL")
    FeatureClassToFeatureClass_conversion(DominantRoutesAll, pub_database, "LRS_County_Primary_LCC", "")
    FeatureClassToFeatureClass_conversion(DominantRtesSinglePart, pub_database, "LRS_County_PrimarySP_LCC", "")
    Project_management(DominantRtesSinglePart, (pub_database+r'/KHUBPub.GEO.'+"LRS_County_PrimarySP"), webmerc, transform_method, in_coor_system, "NO_PRESERVE_SHAPE", "", "NO_VERTICAL")
    Project_management(StateRoutes, (pub_database+r'/KHUBPub.GEO.'+"LRS_State"), webmerc, transform_method, in_coor_system, "NO_PRESERVE_SHAPE", "", "NO_VERTICAL")
    ChangePrivileges_management((pub_database+r'/KHUBPub.GEO.LRS_County_Primary'), "readonly", View="GRANT", Edit="")
    ChangePrivileges_management((pub_database+r'/KHUBPub.GEO.LRS_County_PrimarySP'), "readonly", View="GRANT", Edit="")
    ChangePrivileges_management((pub_database+r'/KHUBPub.GEO.LRS_State'), "readonly", View="GRANT", Edit="")
    ChangePrivileges_management((pub_database+r'/KHUBPub.GEO.LRS_County_PrimarySP_LCC'), "readonly", View="GRANT", Edit="")
    ChangePrivileges_management((pub_database+r'/KHUBPub.GEO.LRS_County_Primary_LCC'), "readonly", View="GRANT", Edit="")
    status_table = (pub_database+r"/KHUBPub.GEO.Publication_Status")
    status_fields = ["DATE", "LAYER_NAME", "STATUS", "MESSAGE"]
    def update_status_table(status_table, data, fields, status_value, message_value):
        # Qualify the class: the module-level import is `import datetime`,
        # so a bare `datetime.now()` would raise an AttributeError here
        now = datetime.datetime.now()
        with da.InsertCursor(status_table, fields) as i_cur:  #@UnresolvedImport
            i_cur.insertRow((now, data, status_value, message_value))
            
    try:
        AddMessage("Primary Routes: updated in publication database")
        update_status_table(status_table, "Primary Routes", status_fields, "Primary routes updated in publication database", None)
    except ExecuteError:
        AddWarning("Primary Routes: produced an error")
        message = GetMessages(2)
        update_status_table(status_table, "Primary Routes", status_fields, "Error Occurred", message)

    
def LYRS2Pub(target):
    # This function writes the pavement layers, equivalent to LYRS in CANSYS, to publication.
    # There are over 96,000 pavement layer records, and the process takes over an hour to write the dataset to publication.
    from datetime import datetime
    #add datetime reporting/printing to check processing time on server
    #add logging and error messaging
    #add Change Privilege Management to add readonly privilege to output
    if target == 'Prod':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\Prod\Geo@KHPub.sde'
        pubinstance = r"khdbpubprod\khubpub_prod"
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\Prod\sde@KHPubProd.sde'
        tds = r'C:\Users\planadm\GISDATA\DbConnections\Prod\TDSUser@KHProd.sde'
        RH_Database = tds

    elif target == 'Test':
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHTest\Geo@KHPubTest.sde'
        pubinstance = r"khdbpubtest\khubpub_test"
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\KHTest\sde@KHPubtest.sde'
        tds = r'C:\Users\planadm\GISDATA\DbConnections\Prod\TDSUser@KHProd.sde'
        RH_Database = tds

    else:
        pub_database = r'C:\Users\planadm\GISDATA\DbConnections\KHDev\Geo@KHPubDev.sde'   
        pubinstance = r"khdbpubdev\khubpub_dev"
        pub_sdedb = r'C:\Users\planadm\GISDATA\DbConnections\KHDev\sde@KHPubDev.sde'
        tds = r'C:\Users\planadm\GISDATA\DbConnections\Prod\TDSUser@KHProd.sde'
        RH_Database = tds
        
    from arcpy import env, MakeRouteEventLayer_lr, FeatureClassToFeatureClass_conversion, DisconnectUser, Delete_management, Exists
    
    env.MResolution = 0.000000018939394
    env.MTolerance = 0.000000621369949
    DisconnectUser(pub_sdedb, "ALL")
    if Exists(pub_database+r'/KHUBPub.GEO.ev_PavementLayers'):  
        Delete_management(pub_database+r'/KHUBPub.GEO.ev_PavementLayers')
    else:
        pass
    MakeRouteEventLayer_lr(tds+r"/KHUB.RH.LRS_County", "RouteId", tds+r"/KHUB.TDSUser.PS_PLV_Def", "RouteID LINE FromMeasure ToMeasure", r"PavementLayers", "", "ERROR_FIELD", "NO_ANGLE_FIELD", "NORMAL", "ANGLE", "LEFT", "POINT")
    FeatureClassToFeatureClass_conversion("PavementLayers", pub_database, "ev_PavementLayers")

def maintest():
    #LYRS2Pub('Test')
    Analyze()
    PostAndReconcile()  ##enable for nightly run
    Compress()          ##enable for nightly run
    Analyze()

def main():
    Analyze()
    PostAndReconcile()  ##enable for nightly run
    Compress()          ##enable for nightly run
    Analyze()
    #RebuildIndexes() #in testing
    #RecreatePubGDB('Dev')
    #RecreatePubGDB('Prod')
    PublicateNetworkProd2Target('Prod') ## update prod transactional data to publication in 'dev', 'test', or 'prod'
    CreatePrimaryNetwork('Prod')
    #PublicateNetworkProd2Target('Test')  
    #CreatePrimaryNetwork('Test')
    #PublicateNetworkProd2Target('Dev') 
    #CreatePrimaryNetwork('Dev')
    #LYRS2Pub('Prod')
    
    print("run completed" )


main()

I wanted to share an update to our scripted process. We have now included a point event for flagging the opposite directions of inventory primary routes on dual carriageway. By putting the event flag on the real primary non-inventory route side, we then adjust the route concurrency table on the segment we flag, overriding the default concurrency rules that would have the inventory direction in both directions across the right of way for two different routes. An example of this in Kansas is where US54 intersects US69 and they run opposite on dual carriageway north of Fort Scott, Kansas. For us, US54 is the primary inventory route, so we flag US54 non-inventory and alter the US69 inventory direction to become US54 non-inventory.
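The override logic described above can be sketched in plain Python, independent of arcpy. This is a hypothetical illustration, not our production code: the function, field names, and the containment rule for when a flag applies are all assumptions for the sketch. Given rows from a concurrency table and a set of flagged (route, from-measure, to-measure) segments, it flips the dominance flag on any record that falls entirely within a flagged segment of the same route.

```python
def apply_dominance_overrides(concurrency_rows, flagged_segments):
    """Flip default dominance where a point-event flag overrides it.

    concurrency_rows: list of dicts with hypothetical keys
        'RouteId', 'FromMeasure', 'ToMeasure', 'DominantFlag' (1 or 0).
    flagged_segments: set of (route_id, from_measure, to_measure) tuples
        marking segments where the default dominance should be inverted.
    """
    for row in concurrency_rows:
        for route_id, fmeas, tmeas in flagged_segments:
            # The override applies when the concurrency record lies
            # entirely within a flagged segment of the same route.
            if (row['RouteId'] == route_id
                    and row['FromMeasure'] >= fmeas
                    and row['ToMeasure'] <= tmeas):
                row['DominantFlag'] = 1 - row['DominantFlag']
    return concurrency_rows
```

In the US54/US69 example, the US69 inventory records inside the flagged segment would have their dominance flipped, so the US54 non-inventory side wins on that stretch.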

Rather than paste all that code again, here is a link to the code on GitHub:

pydev106/GeodbMaintenanceQC.py at master · KDOTGIS/pydev106 · GitHub 

Another important step I want to point out in our process is this other script, which copies metadata updates between the dev, test, prod, transactional, and publication environments:

pydev106/CopyMetadata.py at master · KDOTGIS/pydev106 · GitHub 


Also commented out here is a half-baked process to derive a statewide state highway network from the county network. I think we are working on altering our approach to that, but the process can be added with just a few additional steps if those steps are done in just the right sequence. We are currently ironing out our expectations and requirements for the data this derivation produces to serve interfaces, and we are thinking of adding an event to maintain the state network history as derived from the county network.
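The core of that derivation boils down to two simple operations, mirrored from the VB field calculations in the commented-out section: building a state route ID by concatenating county route attributes (Prefix + Right(RouteNum, 3) + Suffix + InvDir, plus UniqueId when present), and shifting a county measure onto the state route by adding the measure at which the county route begins on the state route (the MIN_FMEAS statistic). The functions below are an illustrative sketch with hypothetical names, not the script itself.

```python
def state_route_id(prefix, route_num, suffix, inv_dir, unique_id=None):
    # Mirrors the VB expression: Prefix + Right(RouteNum, 3) + Suffix + InvDir (+ UniqueId)
    rid = prefix + route_num[-3:] + suffix + inv_dir
    return rid + unique_id if unique_id else rid

def to_state_measure(county_measure, min_state_fmeas):
    # A county milepost becomes a state milepost by adding the measure at
    # which the county route starts along the state route (MIN_FMEAS).
    return county_measure + min_state_fmeas
```

This is why the commented-out section only needs a Statistics table join and two additive field calculations once the state routes themselves are created.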
