What is the best practice for creating a live-updating web layer NOT using GeoEvent Server?

03-21-2019 01:28 PM
KevinChristy1
Regular Contributor

Good day!

So here is the deal.

I have several map services and feature services on my organization's Portal for ArcGIS that update live, roughly every couple of seconds. However, they respond fairly slowly, and I want to make sure I am following best practices.

The breakdown:

The data comes from MS SQL Server 2012. I bring the data into ArcGIS Pro in the form of views. Sometimes these views can reference several tables from different databases. These databases are being continuously updated. The databases are registered with the server.

Each view is geo-enabled. As a result, each record shows up as a point on the map much like a feature class.
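The multi-table view pattern described above can be sketched in miniature. This is an illustration only: Python's built-in sqlite3 stands in for SQL Server 2012, and the table and column names are invented for the example.

```python
import sqlite3

# Illustrative sketch only: sqlite3 stands in for SQL Server 2012, and the
# table/column names (assets, readings, live_points) are invented. In the
# real setup the view exposes a SQL Server spatial column that ArcGIS
# reads as point features.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two source tables, as in "views that reference several tables".
cur.execute("CREATE TABLE assets (asset_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE readings (asset_id INTEGER, x REAL, y REAL, value REAL)")
cur.execute("INSERT INTO assets VALUES (1, 'Pump A'), (2, 'Pump B')")
cur.execute(
    "INSERT INTO readings VALUES (1, -82.64, 27.77, 3.5), (2, -82.66, 27.78, 4.1)"
)

# The "geo-enabled view": each row carries coordinates, so each record
# can be drawn as a point, much like a feature class.
cur.execute("""
    CREATE VIEW live_points AS
    SELECT a.asset_id, a.name, r.x, r.y, r.value
    FROM assets a JOIN readings r ON a.asset_id = r.asset_id
""")

rows = cur.execute("SELECT name, x, y FROM live_points ORDER BY asset_id").fetchall()
print(rows)  # [('Pump A', -82.64, 27.77), ('Pump B', -82.66, 27.78)]
```

Because the source tables are continuously updated, every draw of the layer re-executes the view's joins, which is one reason draw time tracks view execution time.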

I bring the view into ArcGIS Pro, format and symbolize it how I wish, and then publish it as a map service onto my server and Enterprise Portal.

I then reference the REST endpoint of the map service, https://servername.stpete.org/arcgis/rest/services/mapservice_name/MapServer/0, to bring it into my Enterprise portal as a feature service.

The result is a web layer, derived from a SQL view, that updates live.

Is this the best way to create a live-updating layer from MS SQL Server? It functions, can be brought into Operations Dashboard and other web apps, and has full functionality; however, it responds very slowly. It can sometimes take 30-45 seconds to load a web layer that has only about 30 records.
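One way to narrow down where the 30-45 seconds goes is to time a raw query against the layer's REST endpoint, bypassing the web map entirely. A minimal sketch, using the endpoint from the post and standard ArcGIS REST API query parameters; the actual request is left commented out because it needs network access to the server.

```python
from urllib.parse import urlencode

# Sketch: query the layer's REST endpoint directly to see whether the
# delay comes from the service itself or from the web client. The URL is
# the one from the post; the parameters are standard ArcGIS REST API
# feature query parameters.
LAYER_URL = ("https://servername.stpete.org/arcgis/rest/services/"
             "mapservice_name/MapServer/0")

params = {
    "where": "1=1",          # return all records (~30 in this layer)
    "outFields": "*",
    "returnGeometry": "true",
    "f": "json",
}
query_url = f"{LAYER_URL}/query?{urlencode(params)}"
print(query_url)

# To measure raw service response time (requires network access):
# import time, urllib.request
# start = time.perf_counter()
# with urllib.request.urlopen(query_url) as resp:
#     body = resp.read()
# print(f"query took {time.perf_counter() - start:.2f} s, {len(body)} bytes")
```

If the raw query is already slow, the bottleneck is in the service or the underlying view rather than in the dashboard or web map.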

Any thoughts?

Thank you in advance.

1 Solution

Accepted Solutions
KevinChristy1
Regular Contributor

So we ended up using tables. To keep the data up to date, we run a process every morning that truncates the table and then repopulates it. We have found that the speed is greatly increased. Thanks for all of your responses.
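The truncate-and-repopulate process described above can be sketched as follows. sqlite3 stands in for SQL Server purely for illustration; for a table registered with the geodatabase, the arcpy Truncate Table and Append geoprocessing tools would be the usual equivalents.

```python
import sqlite3

# Sketch of the nightly truncate-and-reload pattern, using sqlite3 for
# illustration only; the real setup is SQL Server, and the table/column
# names here are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE published_points (id INTEGER, x REAL, y REAL)")
cur.execute("INSERT INTO published_points VALUES (1, 0.0, 0.0)")  # yesterday's data

def reload_table(cur, fresh_rows):
    """Truncate the published table, then repopulate it from the source query."""
    # sqlite has no TRUNCATE statement; an unqualified DELETE is equivalent.
    cur.execute("DELETE FROM published_points")
    cur.executemany("INSERT INTO published_points VALUES (?, ?, ?)", fresh_rows)

# Rows that would come from running the slow multi-database views each morning.
fresh = [(1, -82.64, 27.77), (2, -82.66, 27.78), (3, -82.63, 27.76)]
reload_table(cur, fresh)

count = cur.execute("SELECT COUNT(*) FROM published_points").fetchone()[0]
print(count)  # 3
```

The speed gain comes from the published layer reading a plain table instead of re-executing the multi-database view joins on every draw, at the cost of data being at most a day old.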


6 Replies
TanuHoque
Esri Regular Contributor

Kevin Christy

There are a few things I'd check to find the bottleneck:

- Compare the draw time of the map image layer with that of the feature layer you created from it.
- Compare the draw time in ArcGIS Pro with the draw time in Portal.
- Check how long the views take to execute directly in MS SQL Server.

Hope this helps.

KevinChristy1
Regular Contributor

Tanu,

Thanks for responding!

The draw times for the map image layer and the resulting feature service layer are identical. Furthermore, the map image layer is not truly supported in Operations Dashboard.

The draw time in ArcGIS Pro is fairly comparable to the draw times in Portal.

And finally, the time it takes to execute the views in MS SQL Server: the longest view (177k records) takes 26 seconds, while the others run anywhere from near-instant to about 3 seconds. My dashboards almost exclusively use the web layers generated from the 0-3 second views.

TanuHoque
Esri Regular Contributor

Thanks Kevin,

Are you saying that the views that take 0-3 seconds when executed from SQL Server take about 30 seconds when used in Pro or in a map service?

If so, can you please use SQL Server Profiler to check what the difference is between 'when you execute from SQL Server' and 'when you draw from Pro'?

This might help you narrow it down.

KevinChristy1
Regular Contributor

Thanks Tanu, will do. It may take us a while to run that (due to permissions within our organization, and it's Friday), but I will post the results here when it's done.

That being said, is this process the best practice for a live updating web layer? Is there a better way to be doing this?

Thanks.

0 Kudos
TanuHoque
Esri Regular Contributor

"is this process the best practice for a live updating web layer? Is there a better way to be doing this?"

As you have already mentioned, as far as I know, a stream layer from a stream service is the best approach for real-time data.

But for your case, it seems like what you did is a valid option. If I recall correctly, a few other users took the same approach to view (near) real-time data at a regular interval (of course there is some latency, which makes it not real-time by definition), and they were fine with that latency. In the end, it all depends on your particular use case.
