We have three feature classes and one related table, and we are thinking about moving them to ArcGIS Portal as hosted feature layers in order to digitize records using Web AppBuilder and the Smart Editor widget.
The table is going to be read-only and needs to be refreshed daily with data imported from other tabular applications.
What is the best way to refresh the table in Portal?
- Process everything in SDE, then truncate/append the hosted table in Portal with the table from SDE?
If you're just refreshing the table, you might consider using the ArcGIS API for Python.
How you use it depends a lot on the source of the data, but since it's non-spatial, you've got a lot of options. Can you elaborate on where the data's coming from?
If it's relevant to your process, we use pandas to update hosted tables in much the same way. Here's one example:
```python
import pandas as pd
from arcgis import GIS

gis = GIS('your-org-url', 'user', 'password')
# .tables returns a list; grab the first (or only) table in the service
hosted_table = gis.content.get('hosted-service-itemID').tables[0]

# Note: psycopg2 is a PostgreSQL driver; for SQL Server use a
# dialect+driver pair such as mssql+pyodbc
db_constr = 'mssql+pyodbc://user:pass@hostname/db?driver=ODBC+Driver+17+for+SQL+Server'
query = 'SELECT * FROM Database.dbo.SomeTable'
df = pd.read_sql_query(query, db_constr)

# Do some data manipulation as needed here

hosted_table.manager.truncate()

# Append new records in chunks
i = 0
while i < len(df):
    fs = df.iloc[i:i+100].spatial.to_featureset()
    adds = hosted_table.edit_features(adds=fs)
    i += 100
```
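The chunked-append loop above can also be written as a small helper (plain Python, independent of arcgis) that keeps each `edit_features` call to a manageable batch size:

```python
def chunks(seq, size=100):
    """Yield successive slices of seq, size items at a time."""
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

# e.g. 250 rows split into batches of 100, 100, 50
batches = list(chunks(list(range(250))))
```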
Also: if you don't want to do a full truncate, you can use the same Python modules to identify rows to update in place rather than delete and re-add. But that gets a bit more complex. Let me know if you'd like me to elaborate on that.
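To sketch the idea in plain Python: compare incoming rows against the existing rows on a unique key, and split them into adds, updates, and deletes. The row dicts and `ID` key field here are hypothetical; in practice you'd pull existing rows from the hosted table with a query and push the three buckets through `edit_features(adds=..., updates=..., deletes=...)`.

```python
def diff_rows(existing, incoming, key='ID'):
    """Split incoming rows into adds/updates, and flag stale existing
    rows for deletion, comparing by a unique key field."""
    existing_by_key = {row[key]: row for row in existing}
    incoming_keys = {row[key] for row in incoming}

    adds = [row for row in incoming if row[key] not in existing_by_key]
    updates = [row for row in incoming
               if row[key] in existing_by_key
               and row != existing_by_key[row[key]]]
    deletes = [k for k in existing_by_key if k not in incoming_keys]
    return adds, updates, deletes

existing = [{'ID': 1, 'val': 'a'}, {'ID': 2, 'val': 'b'}, {'ID': 3, 'val': 'c'}]
incoming = [{'ID': 2, 'val': 'b'}, {'ID': 3, 'val': 'C'}, {'ID': 4, 'val': 'd'}]

adds, updates, deletes = diff_rows(existing, incoming)
# adds: ID 4 (new), updates: ID 3 (changed), deletes: [1] (gone from source)
```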
Then just get your python script scheduled to run daily on a machine that has access to the source data.
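On Linux that could be as simple as a crontab entry (a Scheduled Task does the same job on Windows); the paths and time below are placeholders:

```shell
# Run the refresh script every day at 02:00
0 2 * * * /usr/bin/python3 /opt/scripts/refresh_hosted_table.py >> /var/log/table_refresh.log 2>&1
```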
Final note: supposing your source data isn't a true database, there are numerous other ways of getting tabular data into a dataframe.
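For example, pandas reads flat files directly (`pd.read_csv`, `pd.read_excel`); and even without pandas, the standard library will get you a list of row dicts you can feed to the same update logic. A minimal sketch, using an in-memory string in place of a hypothetical export file:

```python
import csv
import io

# In practice: open('export_from_source_app.csv'); a literal string here for illustration
raw = io.StringIO("ID,val\n1,a\n2,b\n")
rows = list(csv.DictReader(raw))
# rows is a list of dicts keyed by the header row (all values read as strings)
```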
Hello @JoseSanchez ,
If possible, you may want to register your enterprise geodatabase with ArcGIS Server. This would allow you to publish a feature service that references source data from your enterprise geodatabase. Changes to the underlying source data would then be reflected in the feature service. Requirements to register the enterprise geodatabase vary depending on the RDBMS.
I'm doing this daily on a hosted feature layer we have, but its size is spiraling out of control. With every data refresh the size doubles. How do I trim it down? The number of features/rows stays about the same, but the file size doesn't.
We do everything in SQL Server on the back end. I have a feature class that resides in an SDE database and is published as a feature service. As SQL Server updates the related tables, the feature service updates automatically as well. These feature services work GREAT in Map Viewer and Web AppBuilder, but within Experience Builder the same feature service is EXTREMELY slow. More research needed there, but it makes no sense why it crawls in Experience Builder only.