Need a second opinion on an architecture decision. I'm setting up a straightforward imagery repository for my organization. We're on a public cloud and have our own Enterprise Geodatabase. My key requirement is that users (most of whom aren't very technical) need to view and pull each image in cloud storage, along with its metadata, and re-upload edits as a new image.
What I’ve Tried/Considered
Mosaic Datasets – Initially I deployed a mosaic dataset, but it doesn't seem to link directly to cloud object storage (S3, Google Cloud Storage); instead it requires a file/network share (NFS) connection. My cloud-based architecture makes this tricky, and setting up a dedicated file share on-premises adds complexity. As a last resort we are considering a cloud NFS solution (Google Filestore) that all users would have access to.
Cloud Store connection – Convenient for storing large imagery files directly in an object store (Blob Storage or GCS), but users can't easily make changes or push edits back in real time. They'd have to download locally, update, and re-upload. We also lose easy metadata management, which is a big issue.
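One workaround I've been sketching for the metadata-management gap is a JSON sidecar per image, stored next to the object under a predictable key. This is my own convention, not an ArcGIS or cloud-provider feature; the sketch below uses local paths as a stand-in for bucket keys (with a real bucket these reads/writes would be object GET/PUT calls):

```python
import json
import tempfile
from pathlib import Path

def sidecar_key(image_key: str) -> str:
    """Metadata sidecar lives next to the image: tile_001.tif -> tile_001.tif.meta.json."""
    return image_key + ".meta.json"

def write_metadata(root: Path, image_key: str, metadata: dict) -> Path:
    """Write the sidecar JSON; against a real bucket this would be an object PUT."""
    path = root / sidecar_key(image_key)
    path.write_text(json.dumps(metadata, indent=2))
    return path

def read_metadata(root: Path, image_key: str) -> dict:
    """Read the sidecar back; against a real bucket this would be an object GET."""
    return json.loads((root / sidecar_key(image_key)).read_text())

# Local directory standing in for a bucket prefix
with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    write_metadata(root, "tile_001.tif", {"sensor": "hypothetical", "captured": "2024-01-01"})
    print(read_metadata(root, "tile_001.tif")["sensor"])
```

The upside is that metadata travels with the image under the same prefix and needs no database; the downside is there's no transactional guarantee that image and sidecar stay in sync.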
Direct Cloud NFS connections on GIS editors' PCs – Users can mount a cloud file share on their Windows machines so it appears alongside their C:\ drive, then read and write rasters to it. (I couldn't test this.)
Raster Datasets in Enterprise Geodatabase – This would give us a form of centralised storage, but it could massively bloat our database and potentially hurt performance.
I don't need any image processing capabilities, just a way for users to pull required imagery on demand into their ArcGIS Pro environment as a basemap or for basic processing (e.g. converting raster to vector), and then upload new imagery or upload the edited image as a new image. However, they should not be able to edit images in place.
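Since edits must always land as a new image rather than overwrite the original, one way I could enforce that at upload time is an immutable, versioned key scheme: every re-upload targets a fresh object name derived from the existing keys. This naming convention is my own assumption, not an ArcGIS or cloud-storage feature; a minimal sketch:

```python
import re

def next_version_key(existing_keys: list[str], base: str) -> str:
    """Derive the next immutable key for a re-upload, e.g. base 'site_a.tif'
    becomes 'site_a_v2.tif' when 'site_a_v1.tif' already exists.
    Because uploads always target a fresh key, originals are never
    edited in place."""
    stem, ext = base.rsplit(".", 1)
    pattern = re.compile(re.escape(stem) + r"_v(\d+)\." + re.escape(ext) + r"$")
    versions = [int(m.group(1)) for k in existing_keys if (m := pattern.fullmatch(k))]
    return f"{stem}_v{max(versions, default=0) + 1}.{ext}"

print(next_version_key(["site_a_v1.tif", "site_a_v2.tif"], "site_a.tif"))  # site_a_v3.tif
```

In practice this could be combined with a write-once bucket policy (object versioning or retention locks on S3/GCS) so that even technical users can't overwrite existing objects.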
What would be the best way forward?