Help! Trying to come up with a data management solution via REST and Hosted Feature Layers. Could this work?

01-23-2024 08:34 AM
Occasional Contributor

Hey all,

Not sure if this is the best forum to post this in, but thought it could be a good start. Hoping to brainstorm some ideas with the community for how I can come up with a solution to this problem we're facing.


We're using a third-party LMS which works great; however, data is only retained for a maximum of 12 months, after which it's deleted and becomes inaccessible. This is a problem because it's crucial for us to know historical LMS completion (i.e., compliance) values for registered people.

Potential Solution

Thankfully this site has an API - yay! Unfortunately, the data it returns isn't as robust as I'd hoped. Still, for each registered individual I could pull their compliance values (e.g., 23 of 25 required modules completed = 92% compliance) at bi-weekly intervals, thus building an internal copy of the LMS records that we can retain for longer than 12 months.
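The compliance math itself is simple. Here's a minimal sketch of how I'd derive the percentage from each user's record - assuming (hypothetically; the real payload keys will differ) that the API returns completed and required module counts:

```python
def compliance_pct(record):
    """Compute a compliance percentage from an LMS API record.

    Assumes a dict with 'completed' and 'required' module counts -
    these key names are placeholders; adjust to the real payload.
    """
    required = record["required"]
    if required == 0:
        return 0.0  # avoid dividing by zero for users with no required modules
    return round(100 * record["completed"] / required, 1)

# e.g. 23 of 25 required modules completed:
# compliance_pct({"completed": 23, "required": 25}) -> 92.0
```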

I'm thinking of building a Python script to make the required API calls (each returns a dictionary) and then push that payload to a hosted feature layer in AGO, where the parent table has the name of the individual and a related child table has that individual's bi-weekly compliance scores (thus 26 child records per person per year).

Potential Workflow

I haven't built a database through REST calls before, but at a very high level, my thoughts are:

  1. Build a Python script that iterates through a list of registered LMS users and performs the API call
  2. Use the dictionary returned from the API call to push data to the hosted feature layer
    - It should check for new people so it can add them, existing people so it can update them, and anyone removed from the system so it can retire their records
  3. Schedule it at bi-weekly intervals.
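The core of step 2 is diffing the user IDs the API returns against the IDs already in the feature layer. A rough sketch of that partitioning logic (pure Python; the actual adds/updates would then go through something like `FeatureLayer.edit_features` in the ArcGIS API for Python, and "retire" here means flagging, not deleting):

```python
def partition_users(api_ids, layer_ids):
    """Split LMS user IDs into add/update/retire groups by comparing
    IDs returned from the API with IDs already stored in the layer.
    """
    api_ids, layer_ids = set(api_ids), set(layer_ids)
    return {
        "add": api_ids - layer_ids,     # new people -> insert parent record
        "update": api_ids & layer_ids,  # existing people -> append child record
        "retire": layer_ids - api_ids,  # dropped from LMS -> mark inactive
    }
```

Running this once per scheduled interval, before any edits are pushed, keeps the add/update/retire decision in one place.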

Potential Issues

There are quite a large number of users in our LMS (> 2,000). Using 2,000 as a benchmark, the resulting feature layer would have 2,000 parent records and would gain 52,000 child records per year (26 bi-weekly runs × 2,000 users).
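To sanity-check growth over time, a tiny projection helper (assumption: one parent record per user plus one child record per user per scheduled run, and a static user count):

```python
def projected_rows(n_users, runs_per_year=26, years=1):
    """Rough row-count projection for the hosted feature layer:
    one parent record per user, one child record per user per run."""
    parents = n_users
    children = n_users * runs_per_year * years
    return parents, children

# projected_rows(2000) -> (2000, 52000)
```

Even at three years that's ~156,000 child records, which is worth weighing against hosted feature layer storage costs.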

I realize ArcGIS Online isn't an ideal place for storing a dataset that large. And maybe it's not even a viable place to store that dataset as it continues to grow. However, I'm the only GIS person in the organization and I unfortunately don't have any experience with SQL servers or building datasets outside of "traditional" geodatabases or AGOL feature layers.

Additionally, the reason why I'm considering AGOL to store this is that this data will ideally feed into things like surveys in Survey123, Dashboards, etc. So having the data as a hosted feature layer seemed to make the most sense in that regard.

Lastly, I don't know how well their server would cope with the script sending 2,000+ API calls every time it runs.
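One mitigation for that last worry is pacing the calls rather than firing them all at once. A minimal sketch, where `fetch_one` stands in for whatever function wraps the actual HTTP request (hypothetical; swap in `requests` or `urllib` and whatever rate limit their docs specify):

```python
import time

def fetch_all(user_ids, fetch_one, delay=0.2):
    """Call the LMS API once per user, pausing between requests so
    ~2,000 calls arrive as a slow trickle instead of a burst.

    fetch_one: callable taking a user ID and returning its API payload
    delay: seconds to wait between calls (tune to the API's rate limit)
    """
    results = {}
    for uid in user_ids:
        results[uid] = fetch_one(uid)
        time.sleep(delay)
    return results
```

At 0.2 s per call, 2,000 users takes under 7 minutes per run, which is fine for a bi-weekly schedule.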


I'm still in the exploration phase and I don't really know if this is even possible in the way I'm proposing, so any thoughts, feedback, or other ideas would be greatly appreciated.

Thanks for your time!
