I have mosaic datasets that reference a large amount of imagery and elevation data. The mosaic datasets are shared as an image service on an ArcGIS Server machine in a DMZ. I can publish an image service from my internal computer, but to get large amounts of data onto the server I have to ship a drive to our IT department in another state; they copy the data over, and then I can log in to the server and move it where it needs to go.
Is there a way to push only new mosaic dataset items and update the image service?
I have all the data on my local machine, and new data arrives on this machine. The folder on the server is registered. But it seems that when I create a service definition file, either no data is copied (because the folder is registered, so the tool assumes the data is already there), or, if the folder isn't registered, it plans to copy up all of the data. Since I can log in to the server and move data around, I could publish each image as a new service and then move it into the main image service, but that seems like more work than it should be.
Essentially, I'd like the system under the hood to answer:
1.) What data is on my local system?
2.) What identical data is already in my registered folder on the server?
3.) What do I need to publish to my registered folder to make it look like my local system?
4.) Publish the new data.
Preferably in Python.
Any ideas would be appreciated.
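For reference, the sync step I'm imagining (steps 1–3 above) would look something like the sketch below. The paths and the final arcpy call are placeholders, not a working setup; the runnable part just diffs two folder trees by relative path and copies over whatever is missing on the server side.

```python
import shutil
from pathlib import Path

def find_new_files(local_root, server_root):
    """Return relative paths of files that exist under local_root
    but not under server_root (steps 1-3 of the wish list)."""
    local_root, server_root = Path(local_root), Path(server_root)
    new = []
    for p in local_root.rglob("*"):
        if p.is_file():
            rel = p.relative_to(local_root)
            if not (server_root / rel).exists():
                new.append(rel)
    return sorted(new)

def push_new_files(local_root, server_root):
    """Copy only the missing files, preserving folder structure (step 4).
    Returns the list of relative paths that were copied."""
    copied = []
    for rel in find_new_files(local_root, server_root):
        dest = Path(server_root) / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(Path(local_root) / rel, dest)
        copied.append(rel)
    return copied

# After the copy, the mosaic dataset on the server would presumably still
# need the new rasters registered (something like
# arcpy.AddRastersToMosaicDataset_management run against the server-side
# mosaic), and the image service restarted or its cache refreshed; I'm
# guessing at that part.
```

This only compares by file presence; comparing size or modification time as well (os.stat) would catch updated rasters, not just new ones.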