I manage the GIS for an organisation that pulls data from various sources for use in our daily operations. Depending on the data source and the required use, the data can be stored in various locations (e.g. a local server, AGOL, or streamed via a URL).
The challenge I face is tracking and maintaining all the datasets and knowing where each one is used (e.g. ArcGIS Pro templates, AGOL Field Maps / Web Apps, ModelBuilder, Python scripts, etc.).
I have a couple of main concerns:
- Having full visibility of all our managed datasets and their sources
- Making sure the connection points for each dataset across the organisation are working at all times. In particular, if some aspect of a dataset changes, how do I easily identify every place that dataset is used so we can update broken links and workflows?
I picture a really tangled flowchart: it starts with each data source, expands into each dataset we use from those sources, then funnels through models or flows straight to an endpoint where our organisation uses it (e.g. many layers coming back together in an ArcGIS Pro map template that our users open to make an operational map for a contractor).
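That "tangled flowchart" is essentially a directed dependency graph, and even a very lightweight version of one can answer the "where is this used?" question. A minimal sketch in plain Python (all dataset and endpoint names below are illustrative placeholders, not real assets):

```python
from collections import defaultdict

# Hypothetical lineage registry: each edge says "this asset feeds into that asset".
edges = [
    ("AGOL: parcels_service", "Model: clip_to_region"),
    ("Local: roads.gdb/roads", "Model: clip_to_region"),
    ("Model: clip_to_region", "ArcGIS Pro: operations_map_template"),
    ("URL: weather_feed", "Field Maps: inspection_form"),
]

# Forward index: asset -> everything it flows directly into.
downstream = defaultdict(set)
for src, dst in edges:
    downstream[src].add(dst)

def find_dependents(asset):
    """Return every asset reachable downstream of `asset`,
    i.e. everything that could break if it changed."""
    seen = set()
    stack = [asset]
    while stack:
        for nxt in downstream[stack.pop()]:
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

print(sorted(find_dependents("AGOL: parcels_service")))
# → ['ArcGIS Pro: operations_map_template', 'Model: clip_to_region']
```

The edge list could live in a simple spreadsheet or table that gets updated whenever a workflow is built, and the same data can be fed to a graphing tool to draw the flowchart itself.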
I’m interested to know how other data managers deal with the challenge of managing a massive number of datasets and keeping everything in good working order at all times.