data expressions and data size

02-21-2024 02:16 PM
clt_cabq
Occasional Contributor III

One of the caveats mentioned about using data expressions in a dashboard is that performance can suffer depending on the size of the data, primarily the number of records, though the type and length of the data fields probably come into play at some point as well. What is considered 'too much' data to handle with a data expression? Has anyone tried putting a number on how many records can be handled through a data expression without compromising the performance of a dashboard?
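
For anyone trying to put a rough number on this, one quick way to see how much data a data expression would have to pull is to ask the layer for its record count before building the expression. Below is a minimal sketch using the ArcGIS API for Python; the item ID is a hypothetical placeholder for whichever layer your data expression targets:

```python
from arcgis.gis import GIS

# Connect to the portal that hosts the layer the data expression reads from
# (an anonymous connection works for public content).
gis = GIS("https://www.arcgis.com")

# Hypothetical item ID; replace with the item your data expression points at.
item = gis.content.get("abcdef1234567890abcdef1234567890")
layer = item.layers[0]

# return_count_only skips geometry and attributes, so this query is cheap.
record_count = layer.query(where="1=1", return_count_only=True)
print(f"{record_count} records would be in scope for the data expression")
```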

1 Reply
AlexanderDanielPratama
Esri Contributor

Are you using ArcGIS Online or ArcGIS Enterprise? If you are using ArcGIS Enterprise, I believe you can change the service pooling or other parameters on the related service (see Tune and configure services—ArcGIS Server | Documentation for ArcGIS Enterprise). However, I think that is only a short-term fix rather than a best practice.
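
If you want to script that kind of pooling change rather than click through Server Manager, here is a rough sketch against the ArcGIS Server Administrator REST API. The server URL, credentials, service name, and instance counts are all hypothetical placeholders; the documentation linked above is the authoritative reference for which values make sense for your site:

```python
import json
import requests

# Hypothetical admin URL, credentials, and service; adjust to your site.
ADMIN = "https://gisserver.example.com:6443/arcgis/admin"
USER, PASSWORD = "siteadmin", "***"
SERVICE = "Maps/Parcels.MapServer"

# 1. Get an administrative token.
token = requests.post(
    f"{ADMIN}/generateToken",
    data={"username": USER, "password": PASSWORD, "client": "requestip", "f": "json"},
).json()["token"]

# 2. Read the current service definition.
svc = requests.get(
    f"{ADMIN}/services/{SERVICE}",
    params={"f": "json", "token": token},
).json()

# 3. Adjust the pooling settings (these numbers are only examples).
svc["minInstancesPerNode"] = 2
svc["maxInstancesPerNode"] = 6

# 4. Push the edited definition back; the service restarts with the new pooling.
result = requests.post(
    f"{ADMIN}/services/{SERVICE}/edit",
    data={"service": json.dumps(svc), "f": "json", "token": token},
).json()
print(result)
```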

If that still isn't satisfactory, I usually create a scheduled task in the backend that moves old records from the business table into a backup table. For a previous client I had more than 1 million rows because the data was updated quickly and in large volumes, so I set up a scheduled task that ran every 6 hours and archived any records more than 3 days old.
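
If you want to script something similar, here is a minimal sketch of that kind of archiving job using the ArcGIS API for Python. The portal URL, credentials, layer URLs, and the last_edited_date field name are all hypothetical placeholders, and it assumes the business and backup layers share the same schema:

```python
from datetime import datetime, timedelta, timezone

from arcgis.gis import GIS
from arcgis.features import FeatureLayer

# Hypothetical portal, credentials, and layer URLs.
gis = GIS("https://portal.example.com/portal", "admin_user", "***")
business = FeatureLayer(
    "https://portal.example.com/server/rest/services/Business/FeatureServer/0", gis)
backup = FeatureLayer(
    "https://portal.example.com/server/rest/services/BusinessBackup/FeatureServer/0", gis)

# Select everything edited more than 3 days ago (the field name is an assumption).
cutoff = datetime.now(timezone.utc) - timedelta(days=3)
where = f"last_edited_date <= TIMESTAMP '{cutoff:%Y-%m-%d %H:%M:%S}'"
old = business.query(where=where, out_fields="*", return_geometry=True)

if old.features:
    # Copy the old rows into the backup layer...
    backup.edit_features(adds=old.features)
    # ...then delete them from the business layer by object ID.
    oids = ",".join(str(f.attributes["OBJECTID"]) for f in old.features)
    business.edit_features(deletes=oids)
    print(f"Archived {len(old.features)} records older than {cutoff:%Y-%m-%d}")
```

You can run a script like this every 6 hours with Windows Task Scheduler or cron; for very large batches you would want to page through the query results rather than pull everything in one call.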

Hope it helps.

Cheers 
