
Python Execute Reviewer Batch Job script is slower in ArcGIS Pro

11-17-2017 01:24 PM
deleted-user-ZTub55yWbsya
Deactivated User

Running arcpy stand-alone scripts with ArcGIS Pro (64-bit) is supposed to be faster than with ArcGIS 10.5, but it seems to be the opposite with my first attempt at converting a stand-alone script to use ArcGIS Pro. The script is long, but among other things it uses Data Reviewer to run a few attribute checks. I've narrowed the difference down to one call, ExecuteReviewerBatchJob_Reviewer:

arcpy.ExecuteReviewerBatchJob_Reviewer(reviewer_gdb, currentSession, AttributesBatchJob, None, None, "ALL_FEATURES", None)

ArcGIS 10.5 takes 1 minute

ArcGIS Pro takes 5 minutes

Perhaps I'm doing it wrong, but my understanding is that we should run a stand-alone script with Pro like so:

call "C:\Program Files\ArcGIS\Pro\bin\Python\scripts\propy.bat" C:\GISDataReview.py

Has anyone else seen a performance hit when running the same script under Python 3 instead of 2.7 (with the syntax corrected for version 3 where needed)? I wanted to compare apples to apples, but if the script needs to be reworked for Pro I'm open to suggestions. I made sure my Data Reviewer workspace is upgraded for Pro, and when I run the Execute Reviewer Batch Job geoprocessing tool inside Pro it finishes in less than a minute. This script needs to run as a scheduled task.
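For context, a minimal sketch of how the slow call can be timed in isolation (the workspace, session name, and batch job path below are placeholders rather than the actual data):

import time
import arcpy

# placeholder inputs -- substitute the real Reviewer workspace, session, and batch job file
reviewer_gdb = r"C:\Data\Reviewer.gdb"
currentSession = "Session 1 : QC"
AttributesBatchJob = r"C:\Data\AttributeChecks.rbj"

t0 = time.perf_counter()
arcpy.ExecuteReviewerBatchJob_Reviewer(
    reviewer_gdb, currentSession, AttributesBatchJob,
    None, None, "ALL_FEATURES", None)
print("ExecuteReviewerBatchJob_Reviewer took {:.1f} s".format(time.perf_counter() - t0))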

7 Replies
DanPatterson_Retired
MVP Emeritus

You could read for days on the 'which is faster...' debate. In short, it depends... but overall 3.x wins, except perhaps for the thing you are using. That of course assumes pure Python; as soon as you throw other stuff into the mix, all bets are off, and that would include anything Arc*-related. You could decorate your functions if your code is arranged that way. I found it useful for comparing functions I used between the 2.x and 3.x versions:

def time_deco(func):  # timing decorator
    """Timing decorator.

    Prints the elapsed time (and an object count, when the result has a
    length) inside the wrapper, then returns the result unchanged.
    """
    import time
    from functools import wraps

    @wraps(func)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()        # start time
        result = func(*args, **kwargs)  # ... run the function ...
        t1 = time.perf_counter()        # end time
        dt = t1 - t0                    # delta time
        print("\nTiming function for... {}".format(func.__name__))
        try:
            print("  Time: {: <8.2e}s for {:,} objects".format(dt, len(result)))
        except TypeError:               # result has no len()
            print("  Time: {: <8.2e}s".format(dt))
        return result                   # return the result of the function
    return wrapper
deleted-user-ZTub55yWbsya
Deactivated User

Thanks, Dan.  I won't assume anything with ArcGIS Pro and stand-alone scripts.

PeterSchoenfield
Occasional Contributor

I HAVE experienced the same kind of slowness in ArcGIS Pro and in stand-alone Python when using the TableToTable conversion tool to export to a DBF. I'm exporting 329,518 address records. In ArcCatalog and Python 2.7 it takes about 5 seconds; after multiple attempts in ArcGIS Pro and its Python install I gave up after 10 minutes! Something is obviously not right there.

DanPatterson_Retired
MVP Emeritus

Neat... throw the decorator I posted around a def that contains your calls to TableToTable so you can time the same code in 2.7 and 3.5. That would allow you to check whether there are any changes in arcpy between the two implementations. It would be interesting to track these things.
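A minimal sketch of that kind of wrapper, assuming the decorator above is available in the script (the input table and output location here are just placeholders, not the real address data):

import arcpy

@time_deco
def export_addresses():
    """Run the TableToTable export so the decorator can time it."""
    # placeholder paths -- substitute the real address table and output folder
    return arcpy.TableToTable_conversion(
        r"C:\Data\Addresses.gdb\AddressPoints",  # in_rows
        r"C:\Temp",                              # out_path
        "addresses.dbf")                         # out_name

export_addresses()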

PeterSchoenfield
Occasional Contributor

I did some additional testing to record the times. Using the ArcGIS Desktop Python, here are the results:

C:\Temp>python TimeTest.py
07:41:50 AM, Tuesday December 19, 2017
07:42:22 AM, Tuesday December 19, 2017
C:\Temp>

Then using the ArcGIS Pro Python, same script:

C:\Temp>"C:\Program Files\ArcGIS\Pro\bin\Python\scripts\propy.bat" TimeTest.py
09:18:50 AM, Monday December 18, 2017
05:46:15 AM, Tuesday December 19, 2017
C:\Temp>

I think the results speak for themselves. I will be calling tech support about this.
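For reference, a minimal sketch of what a TimeTest.py along those lines looks like (the input table and output location are placeholders, not the actual address data):

import time
import arcpy

# placeholder paths -- substitute the real address table and output folder
in_table = r"C:\Data\Addresses.gdb\AddressPoints"
out_path = r"C:\Temp"
out_name = "addresses.dbf"

print(time.strftime("%I:%M:%S %p, %A %B %d, %Y"))   # start timestamp
arcpy.TableToTable_conversion(in_table, out_path, out_name)
print(time.strftime("%I:%M:%S %p, %A %B %d, %Y"))   # end timestamp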

DanPatterson_Retired
MVP Emeritus

Bizarre!!! You certainly do have a time problem there. Report back! It would be interesting to find out what caused this.

DrewFlater
Esri Regular Contributor

Hi Peter,

I work at Esri and have done some testing with the TableToTable tool. I just tried a dataset with ~70k records exported to a dbf, and it took a little over one second. I believe there might be something specific to your data, or to the settings you are using with the tool, that is causing this problem. Maybe your dataset has a lot of fields? Mine only had about 20. Would it be possible for you to share the data with me, along with the Python command you are running? I can set up a Box folder for uploading the data; just send me a message at dflater@esri.com.
