
Increasing latency

BrianHunt1
New Contributor

Over the past few weeks, others and I working on ArcPro geologic map projects have noticed increasing latency. We can create points and lines for perhaps an hour, but then it becomes increasingly slow, with points showing up many seconds after a click. The issue is temporarily solved if we close the program and reopen it, only to have to do it again within 30-60 minutes. Really frustrating and inefficient. This seems to be a recent development. We've tried some of the basics like clearing caches and working from local drives, etc.

thoughts?

George_Thompson
Esri Notable Contributor

Where is the data that you are editing hosted (Online, Enterprise, Network location, RDBMS, etc.)?

Does this happen for every editor, or only a subset?

Are you working over a VPN?

Can you provide some more details (version of Pro / number of records / etc.)?

Thanks!

--- George T.
BrianHunt1
New Contributor

Hi, 

We are all working on our local C: drives (SSD) and not through a VPN (although I do use one at times). We keep our data local, except for some of the Esri basemaps. I had been working off my server, but I moved the data to my local drive last week to try to resolve the issue. While that did speed things up a bit, the eventual latency issue remains.

I'm using the latest ArcPro, 3.5.4. The data I'm editing is not very large, just some geologic map contacts. I've worked on much bigger datasets in the past.

thanks!

MikeVolz
Frequent Contributor

Can you try downgrading ArcGIS Pro on one of these machines, or get a test machine and install an older version of ArcGIS Pro (3.3.x or 3.4.x), to see whether the latency exists in those versions?

George_Thompson
Esri Notable Contributor

I appreciate the information above; it helps narrow down the possibilities.

Also, what format is the data in on the local drive: file geodatabase, shapefiles, etc.?

--- George T.
RTPL_AU
Honored Contributor

@BrianHunt1 
Yes.
I've noticed it in 3.5.x.
I now restart Pro before doing anything 'new': I'll do editing in one session, then do layouts in a new session. If editing takes too long, I have to restart the session.
Due to the variety of data I use (all on the local network 99% of the time), it's hard to pin down what causes it.

It's most noticeable in the attribute table, which gradually gets less responsive. Scrolling around in a dataset with ~250 records becomes entangled in molasses. Copying and pasting text between entries can see the paste operation take a second per record (when the copy doesn't cause a freeze; see other posts).
Pulling up the colour palette will have the various colours populate one by one rather than the whole window popping in at once.

Restart Pro and it behaves better for a while.  
There has to be something that triggers it - I've had sessions working with large elevation datasets or just layouts for a few hours with no issue. Move on to the next job and within an hour I can sell treacle.

@George_Thompson 
I have two active PCs on Pro 3.5.x.
Both are very high-end desktops with high-end GPUs, SSDs, and lots of RAM. One is Win 11, one Server 2025 (for testing other Pro issues). 10 GbE network. The majority of data is in FGDB.
No VPN. Local users only, no Entra/AD/domains. No OneDrive syncing of profile stuff.
Some data comes from AGOL over 500 Mbit fibre internet.
The second PC I mentioned was built from bare metal to run Pro 3.5.3 (now 3.5.4), using the Pro .exe install for all users and manual patching from downloaded updates, with no previous Pro versions installed, a fresh .NET (8.20), and a new local user account with no cloud sync or roaming profiles.
The Win 11 machine also has ArcMap installed and was updated to 3.5.x from 3.3.

The slow-down doesn't seem to depend on data location, size, or complexity.
I use our state property FGDB (~2.5 GB) often and have terrains and DEM rasters of many GB; these are usually fast thanks to a decent network setup. I can be editing a drillhole point dataset with 1000, 250, or 50 records and it will start getting very sluggish after a while.

The increase in lag is over & above the 'normal' Pro UI lag.  Some things always take a few seconds to work/show/etc. This is about a change in behaviour over time in a specific instance of Pro.  

You can have one or multiple instances of Pro open and one will exhibit the behaviour depending on what you're doing. Open the same data in another instance and it will perform as expected.

While Pro is being laggy, you can open any other application (Solid Edge, Excel, Google Earth, Corel, QGIS) and it will not display any similar symptoms, from either a GPU or a network point of view (ruling out some potential issues with drivers or hardware).
While Pro is misbehaving, I have not been able to find any corresponding logs on the file server (so I assume no share or server hardware issues) or in the Windows logs (no local driver, disk, or app issues). Unifi logs don't mention any network-based issues.

I mentioned QGIS: I can open the attribute table of an FGDB dataset that is open in Pro and being laggy, and it is very fast in QGIS. The same goes for ArcMap. I can open ArcMap on the same Win 11 PC and start an editing session on the same data (after stopping the edit in Pro) and see no performance issues.
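For what it's worth, a quick way to time a raw attribute read of the same FGDB outside Pro is to loop over the layer with GDAL/OGR's OpenFileGDB driver (the same driver QGIS typically uses for file geodatabases). A minimal sketch is below; the geodatabase path and layer name are placeholders, and it assumes the GDAL Python bindings are available in whatever Python environment you run it from.

```python
# Rough timing of a full attribute read from a file geodatabase outside Pro,
# via GDAL/OGR's OpenFileGDB driver. Path and layer name below are placeholders.
import time
from osgeo import ogr

GDB_PATH = r"C:\data\drillholes.gdb"   # hypothetical FGDB
LAYER_NAME = "drillhole_points"        # hypothetical feature class

ds = ogr.Open(GDB_PATH)                # returns None if the FGDB can't be opened
layer = ds.GetLayerByName(LAYER_NAME)

start = time.perf_counter()
count = 0
for feature in layer:                  # iterate every row, touching all attributes
    _ = feature.items()                # dict of field name -> value
    count += 1
elapsed = time.perf_counter() - start
print(f"Read {count} features in {elapsed:.2f} s")
```

If that stays fast while the Pro attribute table crawls, it would point at Pro's session state rather than the data or the share.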

It is hard to compare this behaviour with 3.3.x as that version suffered from the excessive license checking bug. In some ways the symptoms were very similar but with 3.3.x you could see the repeated web calls in the Diag Monitor. 
With the 3.5.x issue I don't see any errors in Diag Mon; I just see tasks/steps/calls subjectively taking longer to complete.


George - I'm not sure which department you are in, but would it be feasible for Esri to create a reference dataset and then publish some basic execution times for typical configurations using a few GP tools or operations?

I know there is the testing toolkit that runs a few operations on a supplied FGDB dataset, but I'm talking about something much simpler.
The key is the expected execution times, which we can then use to work out whether what we see is in Esri's expected ballpark or not.
If you have a dataset and say it should 'process' in 10 s when run across a typical 1 GbE network from an HDD-based share, and I see 30 s, or 1 s, then I have something to work with.
The sample data should be an FGDB with large point, polygon (property or geology...), and raster datasets. We can test joins, definition queries, symbology, and a few raster functions (hillshade, section, extract elevation to points) - something along the lines of the sketch below.
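This is only a sketch of the shape it could take, assuming a hypothetical reference FGDB containing "points" and "polygons" feature classes; the tool choice is illustrative, not an official Esri benchmark.

```python
# Sketch of a minimal reference benchmark: time a few standard geoprocessing
# tools against a hypothetical Esri-supplied FGDB. Names and paths are placeholders.
import time
import arcpy

GDB = r"C:\benchmarks\reference.gdb"    # hypothetical reference geodatabase
arcpy.env.workspace = GDB
arcpy.env.overwriteOutput = True

def timed(label, tool, *args):
    """Run one geoprocessing call and print its wall-clock time."""
    start = time.perf_counter()
    tool(*args)
    print(f"{label}: {time.perf_counter() - start:.1f} s")

timed("Get count of points", arcpy.management.GetCount, "points")
timed("Buffer points by 100 m", arcpy.analysis.Buffer, "points", "points_buf", "100 Meters")
timed("Spatial join points to polygons", arcpy.analysis.SpatialJoin,
      "points", "polygons", "points_joined")
```

Publish the expected numbers for each hardware tier alongside the data, and anyone can tell at a glance whether their install is in the ballpark.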

Run 3 scenarios:
*Laptop with iGPU (AMD Strix Halo, not the dodgy ones), single decent SSD, 32 GB RAM, local data.
*Business desktop with a typical i7 (or whatever naming Intel want to use that week), 32 GB RAM, basic SSD, RTX 5070, across 1 GbE to a decent HDD share.
*High-end desktop with Threadripper Pro, 256 GB RAM, RTX Pro 6000, multiple PCIe Gen5 SSDs, 10 GbE to an SSD-based share (to actually get the 1 GB/s possible), etc.
Don't get into the weeds on Windows versions, driver versions - just give a time that a properly set up system will achieve.

I know benchmarking is complicated, but having something that can be used as a ballpark reference would be very handy.

Just had a thought - Gamify it!
Issue a reference dataset and publish a notebook to run a bunch of operations. Collect the results with Survey123 and have a leaderboard for a few hardware spec categories. Every year the winner gets a mention at UC and a prize (coozies, and/or free licenses depending on the category). 
Entrants are free to optimise the data and/or scripts but have to publish changes along with hardware config.
You can bring in sponsors like Falcon Northwest, Puget Systems, AMD, Nvidia, Kioxia. This can be HUGE!!!!
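On the "collect the results" side, a notebook cell could bundle the timings with a basic machine description before submission. The sketch below is purely illustrative: the field names, output file, and category label are placeholders, and the Survey123 step itself is omitted.

```python
# Hypothetical packaging of benchmark timings for a leaderboard submission.
# Field names and the output file are placeholders, not any official schema.
import json
import platform

def package_results(timings, category):
    """Bundle wall-clock timings (seconds) with a basic machine description."""
    return {
        "category": category,            # hardware spec category, e.g. "business_desktop"
        "machine": platform.node(),
        "os": platform.platform(),
        "cpu": platform.processor(),
        "timings_s": timings,            # filled in by the benchmark run
    }

# Write the bundle out; a Survey123 form (or anything else) could ingest it later.
with open("benchmark_results.json", "w") as f:
    json.dump(package_results({}, "business_desktop"), f, indent=2)
```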

 

 
