Running scripts in cloud environment really slow?

04-01-2019 09:00 PM
deleted-user-yC5VkbyXzrQR
Occasional Contributor

Hello, 

I have a general question about running scripts that write to a cloud environment (Azure), and whether anyone has ideas to make them run faster.

When I run a script that chains various GP tools and saves the outputs to the cloud environment, it runs almost 5x slower than when I save the outputs to my local drive. I go from 25 seconds saving outputs locally to 3 minutes saving to the cloud environment.

I know writing to the cloud will always be slower, but by that much? Is this normal?

Any tips on how to improve performance?

This is only a small script; a larger one that takes 5 minutes on my local drive would almost certainly be 5x slower as well.

Regards, 

Adam

1 Solution

Accepted Solutions
JoshuaBixby
MVP Esteemed Contributor

I would say what you are seeing is expected, possibly even better than expected.  The performance of outputting data depends on many factors ranging from the size and structure of the data itself to the throughput and latency of the storage tier.

Many ArcGIS geoprocessing tools stream outputs to "disk" as they are generated/created, which makes sense because creating the output completely in memory before writing it to disk isn't doable for many datasets. Streaming outputs this way results in many small but frequent disk writes. Given that wide-area networks tend to operate with tens or hundreds of milliseconds of latency, while local disks tend to offer substantially higher throughput and sub-millisecond latency, it is expected that writing geoprocessing outputs straight to a cloud environment will be slow when the machine doing the processing is not in the cloud.
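A little back-of-the-envelope arithmetic shows why per-write latency, not throughput, dominates here. The numbers below are illustrative stand-ins, not measurements from any particular Azure tier:

```python
# Illustrative latency math: many small streamed writes over a WAN
# vs. a local disk. All numbers are made-up but in realistic ranges.
n_writes = 5_000          # small writes issued by a streaming GP tool
wan_latency_ms = 50       # per-round-trip latency to cloud storage
local_latency_ms = 0.5    # per-write latency of a local disk

wan_total_s = n_writes * wan_latency_ms / 1000     # 250.0 s
local_total_s = n_writes * local_latency_ms / 1000 # 2.5 s
slowdown = wan_total_s / local_total_s             # 100.0x

print(f"WAN: {wan_total_s:.0f}s, local: {local_total_s:.1f}s, "
      f"slowdown: {slowdown:.0f}x")
```

Even with plenty of bandwidth, latency paid once per write adds up quickly, which is why batching everything into one bulk copy at the end is so much cheaper.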

I would suggest either running the analyses locally and copying just the final products to the cloud environment at the end, or processing more of your intermediate datasets in memory (if possible) before copying the final results to the cloud environment.
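The suggested pattern can be sketched roughly as below. This is a plain-Python stand-in, not arcpy code: `run_analysis` and the scratch/cloud paths are hypothetical placeholders for your chain of GP tools, a local scratch workspace (or the `memory` workspace in ArcGIS Pro), and the Azure share:

```python
# Sketch of "process locally, copy only the final product to the cloud".
# run_analysis and all paths are hypothetical stand-ins for a real
# geoprocessing chain, a local scratch workspace, and a cloud share.
import shutil
import tempfile
from pathlib import Path

def run_analysis(scratch: Path) -> Path:
    # Stand-in for chained GP tools writing intermediates locally.
    intermediate = scratch / "intermediate.txt"
    intermediate.write_text("intermediate rows...")
    final = scratch / "final_output.txt"
    final.write_text("final result")
    return final

with tempfile.TemporaryDirectory() as local_scratch, \
     tempfile.TemporaryDirectory() as cloud_share:  # stand-in for Azure path
    final = run_analysis(Path(local_scratch))
    # One bulk copy at the end instead of many small writes over the WAN.
    dest = shutil.copy(final, cloud_share)
    copied_text = Path(dest).read_text()

print(copied_text)
```

The key point is that the WAN is touched exactly once, with one sequential copy, rather than once per incremental write during processing.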

Also, you could spin up ArcGIS on a cloud-based virtual desktop and do the processing and exporting completely within the cloud.


2 Replies
DanPatterson_Retired
MVP Esteemed Contributor

ArcGIS Pro on Microsoft Azure Cloud—ArcGIS Pro | ArcGIS Desktop 

Not much mention of processing times; it seems to focus mostly on rendering.

There is more info in the system requirements section

ArcGIS Pro 2.3 system requirements—ArcGIS Pro | ArcGIS Desktop 
