Looking for suggestions on best practices for using the ArcGIS Data Store and file geodatabases with my Enterprise deployment.
My environment: an ArcGIS Enterprise 10.8 deployment (Portal, Data Store, Server, etc.).
Some of my data is kept in an enterprise PostgreSQL geodatabase, but not all. (I use the enterprise geodatabase for versioning workflows and geodatabase replication.)
Here's what I'm doing...
Currently, I'm constructing a file geodatabase based on nightly downloads from my county's GIS and tax departments. Basically, they provide updated shapefiles and text files that I massage (clip and re-project) and export via Python scripts so that the data I want resides in a single, coherent file geodatabase.
This data is not static: though the schema doesn't change, my scripts wholesale replace the feature classes in the geodatabase nightly. Let's call this the "county FGDB".
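For context, the nightly rebuild is roughly this shape. This is a minimal sketch, not my actual script: the directory paths, the boundary feature class used for clipping, and the output spatial reference WKID are all placeholders, and the real script also handles the tax department's text files.

```python
"""Nightly rebuild of the "county FGDB" from downloaded shapefiles (sketch)."""
import os


def fc_name(shapefile_path):
    """Derive the target feature class name from a shapefile path."""
    return os.path.splitext(os.path.basename(shapefile_path))[0]


def rebuild_county_fgdb(download_dir, fgdb_path, boundary_fc, out_sr_wkid):
    """Clip, re-project, and load each county shapefile into the FGDB,
    wholesale replacing the existing feature classes (schema unchanged)."""
    import arcpy  # deferred so the helper above is usable without ArcGIS
    arcpy.env.overwriteOutput = True  # lets each nightly run replace feature classes
    out_sr = arcpy.SpatialReference(out_sr_wkid)
    for shp in (f for f in os.listdir(download_dir) if f.endswith(".shp")):
        src = os.path.join(download_dir, shp)
        # clip to the county boundary in memory, then project into the FGDB
        clipped = arcpy.analysis.Clip(src, boundary_fc, "in_memory/clipped")
        arcpy.management.Project(clipped, os.path.join(fgdb_path, fc_name(shp)), out_sr)
```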
With this data, I want to create a "map image layer" (no need to edit this data, so it doesn't need to be a feature layer) to use as reference layers in Portal web maps.
Seems I have two options. One is to create a map in ArcGIS Pro that references the data in my ever-changing "county FGDB" from above. When I publish the layer, I reference the folder that holds my FGDB and ensure it is registered in my "Portal items" as a data store.
That way, when my "county FGDB" changes (as it gets updated by my script), the "map image service" layers also serve the new data; I do not have to republish the layers.
Kind of an easy way to deal with serving this data.
The other option is to add this data via script to my enterprise PostgreSQL geodatabase and register it with my Portal. Same deal: if the data changes, the layer changes.
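If I went the enterprise-geodatabase route, the nightly load could be a truncate-and-append per feature class, something like the sketch below. The .sde connection path is a placeholder, and it assumes the target feature classes already exist with matching schema and are not versioned (which fits my case, since I won't version or replicate this data).

```python
"""Nightly refresh of county feature classes in the enterprise GDB (sketch)."""
import os


def replace_plan(fc_names, sde_connection):
    """Pair each county feature class with its target path in the enterprise GDB."""
    return [(fc, os.path.join(sde_connection, fc)) for fc in fc_names]


def refresh_egdb(fgdb_path, sde_connection):
    """Truncate each enterprise feature class and reload it from the county FGDB."""
    import arcpy  # deferred so the helper above is usable without ArcGIS
    arcpy.env.workspace = fgdb_path
    for src, target in replace_plan(arcpy.ListFeatureClasses(), sde_connection):
        arcpy.management.TruncateTable(target)        # empty the enterprise FC
        arcpy.management.Append(src, target, "TEST")  # reload; schemas must match
```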
Question though... Which option will give better performance? I have no idea how my FGDB is served through the Data Store. Will there be a performance penalty for accessing the data from there?
I'm confused about how my file geodatabase interacts with the Data Store. Seems weird. Magical.
Wondering if there will be a bottleneck if many people start accessing the "map image layer" using this method.
Would I be better off, performance-wise, keeping this data in my enterprise PostgreSQL geodatabase and publishing from there?
I have no good reason to keep this data in my Enterprise geodatabase, if I do not have to. I won't be versioning it, it won't be replicated.
I have also toyed with the idea of importing my file geodatabase directly into my Portal, but that seems to only create hosted "feature layers", which I do not need. My users will not edit this data, and I don't want to be impeded by the symbology limitations of "feature layers" vs. "map image layers". Further, I'd have to script some process to replace the hosted feature layers from a newly minted geodatabase nightly. Seems possible, but maybe not exactly what I want to do, since there would be no map image layers.
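For completeness, if I did go the hosted route, the nightly replacement could probably be scripted with the ArcGIS API for Python's overwrite workflow, roughly like this sketch. The portal URL, credentials, and item ID are placeholders, and it assumes the hosted layer was originally published from a zipped FGDB so the overwrite source type matches.

```python
"""Overwrite a hosted feature layer from a freshly zipped FGDB (sketch)."""
import os
import zipfile


def zip_fgdb(fgdb_path, zip_path):
    """Zip a file geodatabase folder so it can serve as an overwrite source."""
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _dirs, files in os.walk(fgdb_path):
            for name in files:
                full = os.path.join(root, name)
                # keep the .gdb folder itself in the archive paths
                zf.write(full, os.path.relpath(full, os.path.dirname(fgdb_path)))
    return zip_path


def overwrite_hosted_layer(item_id, fgdb_zip):
    """Replace a hosted feature layer's data with the new FGDB contents."""
    from arcgis.gis import GIS  # deferred import; requires the arcgis package
    from arcgis.features import FeatureLayerCollection
    gis = GIS("https://portal.example.com/portal", "user", "password")  # placeholders
    item = gis.content.get(item_id)
    flc = FeatureLayerCollection.fromitem(item)
    flc.manager.overwrite(fgdb_zip)  # source must match the original publish type
```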
Any advice or observations about my workflow? Is anyone else doing something like this?