Dynamic Dashboard/Automated Dashboard

09-10-2021 10:11 AM
AnnettePoole1
New Contributor II

Hello, we are using Enterprise 10.8.1. We want the dashboards we have created to update automatically/dynamically. We connected our SQL Server database to ArcGIS Pro and added the SQL data to the ArcGIS Data Store. It is our understanding that the only way to dynamically update a dashboard is via the data store. This step is now complete.

We are now working on the analyst workflow for using the data, and I would like to know the supported workflow for creating dashboards from data store data. This is what we have found:

1. We usually bring the data into ArcGIS Pro, use Display XY Data to map the points, publish, then create the dashboard.

2. With the data store, we followed the same process (Display XY Data, create a feature layer, publish to Enterprise), but when creating the dashboard and trying to add a serial chart we received an error that the data could not be used for statistics.

We discovered that using the Make XY Event Layer tool instead of Display XY Data worked: we ran Make XY Event Layer, published the data, then created the dashboard, and the serial chart widget worked with the data. So my question is: is using Make XY Event Layer the way to create dynamic dashboards, and can you not use Display XY Data to create the feature layer? Thank you

10 Replies
forestknutsen1
MVP Regular Contributor

I am not sure how helpful this will be for you.

But we just created a dashboard that automatically refreshes x, y events. But the pattern is a little different.

The process:

1. Update a table in Oracle with latitude, longitude, and other attributes via a Python batch job.

2. In ArcMap, create the XY events, set the symbology, etc., and publish the service to Portal/Server.

3. Create a map and set its refresh interval.

4. Create the dashboard.
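
Step 1 above (the Python batch job) can be sketched without any GIS dependencies. This is a minimal sketch under assumed names: the `POINT_EVENTS` table, its columns, and the sample row are all hypothetical, and it only builds the parameterized SQL — a real job would execute it with a driver such as python-oracledb or pyodbc.

```python
# Sketch of the batch-job step: build a parameterized Oracle-style
# MERGE (upsert) statement for a coordinate table. Table and column
# names are hypothetical; execute with a real DB driver.

def build_upsert(table, key_col, cols):
    """Return a parameterized MERGE SQL string for rows with the given columns."""
    placeholders = ", ".join(f":{c}" for c in cols)
    col_list = ", ".join(cols)
    updates = ", ".join(f"{c} = :{c}" for c in cols if c != key_col)
    return (
        f"MERGE INTO {table} t "
        f"USING (SELECT :{key_col} AS {key_col} FROM dual) s "
        f"ON (t.{key_col} = s.{key_col}) "
        f"WHEN MATCHED THEN UPDATE SET {updates} "
        f"WHEN NOT MATCHED THEN INSERT ({col_list}) VALUES ({placeholders})"
    )

rows = [
    {"event_id": 1, "latitude": 61.2, "longitude": -149.9, "status": "open"},
]
sql = build_upsert("POINT_EVENTS", "event_id", list(rows[0]))
# A real job would now do: cursor.executemany(sql, rows); conn.commit()
```

Scheduling this with Task Scheduler or cron every few minutes keeps the table — and therefore the registered service — fresh.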

AnnettePoole1
New Contributor II

Hello, 

Thank you for your response. Yes, that is the workflow we use, but with SQL Server.

Question: when you "create the x, y events", what tool are you using?

When we use Display XY Data and create a feature layer, the dashboard doesn't work, but when we use the Make XY Event Layer geoprocessing tool it does work.

Thank you,

Annette

forestknutsen1
MVP Regular Contributor

I am just right-clicking the table in the ArcMap table of contents and selecting Display XY Data.

[screenshot: forestknutsen1_0-1631298321771.png]

Then I publish it as a map service, so on the Portal side I end up with a "Map Image Layer".

AnnettePoole1
New Contributor II

That's exactly how we were doing it before. I did try to publish it as a map service and it didn't work, probably due to permissions. Good to know, though, that you can do it with Display XY Data. I also see the Python script — are you using that as well in the above example? Thank you very much, this has been very helpful.

AnnettePoole1
New Contributor II

Hi Forestknutsen1,

Can you show me how you publish it as a map service? I notice you are in ArcMap; we are using ArcGIS Pro, but I just opened ArcMap to try it. Thank you.

jcarlson
MVP Regular Contributor

I'd look into using the ArcGIS API for Python, if that is an option for you. You can easily update your hosted layer in place, without opening desktop software or re-running any GP tools.

from arcgis.gis import GIS
from arcgis.features import GeoAccessor
import pandas as pd

# Log in to Portal
gis = GIS('your-portal-url', 'user', 'password')

# Get hosted layer; replace '0' w/ whatever the layer index is
fl = gis.content.get('item-id-of-layer').layers[0]

# Query out from your SQL database
sql = """
    some sql query string
    SELECT whatever FROM wherever WHERE something=1
"""

conn = "some connection string"  # see sqlalchemy docs for what this should be

df = pd.read_sql(sql, conn)

# Convert to spatial dataframe using your x/y column names
sdf = GeoAccessor.from_xy(df, 'x_column', 'y_column', sr=4326)


With that spatial dataframe, you can apply the resulting features to your layer quite simply.

fl.edit_features(adds=sdf.spatial.to_featureset())


If there are issues with duplicates from your query, you can truncate the hosted layer prior to adding new features.
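
The truncate-then-append pattern might look like the sketch below. Because the live calls need a hosted layer and credentials, they appear only as comments; the runnable part shows how the feature payload for `edit_features(adds=...)` is shaped. All field and coordinate names here are hypothetical.

```python
# Sketch of truncate-then-append. The feature dicts below match the
# point-feature JSON that edit_features(adds=...) accepts, so they can
# be built straight from SQL query rows. Names are hypothetical.

def rows_to_features(rows, x_col, y_col, wkid=4326):
    """Convert plain dict rows into Esri point-feature JSON."""
    features = []
    for row in rows:
        attrs = {k: v for k, v in row.items() if k not in (x_col, y_col)}
        features.append({
            "geometry": {
                "x": row[x_col],
                "y": row[y_col],
                "spatialReference": {"wkid": wkid},
            },
            "attributes": attrs,
        })
    return features

rows = [{"lon": -149.9, "lat": 61.2, "status": "open"}]
adds = rows_to_features(rows, "lon", "lat")

# Against a live hosted layer you would then do something like:
# fl.manager.truncate()        # clear existing rows first
# fl.edit_features(adds=adds)  # append the fresh query results
```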

- Josh Carlson
Kendall County GIS
forestknutsen1
MVP Regular Contributor

So, I started with a similar strategy, but I was using ArcGIS Online as the GIS target. It worked well overall. I had the job set to run every few minutes. The problem was that the connection would mess up (time out) every few days, creating a hole in the data updates. I found the AGS service to be far more robust. But maybe this strategy would work well for Annette.


This process for me was:

Add the CSV to ArcGIS Online by hand and create a hosted feature layer:

[screenshots: forestknutsen1_0-1631299601981.png, forestknutsen1_1-1631299682012.png, forestknutsen1_2-1631299713382.png]

Then update it with a Python batch job.

This is my simple proof-of-concept script (I have the full dev one as well if you want to see it; the prototype might be faster to digest):


from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection
import pickle

# Flip-flop flag persisted between runs, so each run overwrites the
# layers with the other pair of test CSVs
original_pickle = 'original'
with open(original_pickle, 'rb') as in_file:
    original = pickle.load(in_file)
print(original)

if original:
    city = 'city.csv'
    time = 'time.csv'
else:
    city = 'city2.csv'
    time = 'time2.csv'

gis = GIS(url=xxxx, username=xxxx, password=xxxx)

my_content = gis.content.search(query='owner:' + gis.users.me.username,
                                item_type='Feature Layer Collection',
                                max_items=100)
cities_item, = [x for x in my_content if x['title'] == 'city']
time_item, = [x for x in my_content if x['title'] == 'time2']

print('updating city layer to: {}'.format(city))
cities_flayer_collection = FeatureLayerCollection.fromitem(cities_item)
cities_flayer_collection.manager.overwrite(city)
print('updating time layer to: {}'.format(time))
time_flayer_collection = FeatureLayerCollection.fromitem(time_item)
time_flayer_collection.manager.overwrite(time)
print('done')

# Flip and persist the flag for the next run
original = not original
with open(original_pickle, 'wb') as out_file:
    pickle.dump(original, out_file)

AnnettePoole1
New Contributor II

Thank you.  I will see if we can try this as well.

AnnettePoole1
New Contributor II

We are federated; are you federated or stand-alone? It seems, though I am not 100% sure, that publishing a service works for stand-alone. Thank you.
