I happily tried to publish the following model as a service and very quickly ran into trouble. It basically takes a zip file containing a shapefile, imports it into an SDE database (SQL Server), merges it with an existing feature class, and deletes the intermediate products. It works perfectly on my desktop, but after publishing it as a service the "Import to Database" step gives the following error: "ERROR 000210: Cannot create output C:\arcgisserver\directories\arcgissystem\arcgisinput\TestFolder\Model5.GPServer\extracted\v101\sql06.sde\temp1 Failed to execute (Import to Database). Failed to execute (Model3). Failed to execute (merge_layer)." The database is registered with the AGS (10.3), and when I publish I don't get prompted to copy anything.
Any ideas why AGS can't write to the database?
Check all your directory permissions, SDE privileges, model parameters and paths, and tool input and output parameters. GPs that write to physical output locations run fine on the desktop, but when published as a GP service those locations must be reset to in_memory so the service's input and jobs directories can work with them. Lastly, strip your model back to the first point of failure and solve from there.
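As a hedged illustration of that last point (the function and paths below are hypothetical, not from the model in question), the idea is simply to switch intermediate outputs from a physical path to the in_memory workspace when the tool runs as a service:

```python
# Hypothetical sketch: route intermediate outputs to a physical path on
# the desktop but to the in_memory workspace once published as a service.
# All names and paths here are illustrative.
def intermediate_output(name, running_as_service):
    if running_as_service:
        return "in_memory/" + name               # service-safe scratch location
    return r"C:\temp\scratch.gdb" + "\\" + name  # desktop-only physical path
```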
Thank you David,
You helped me put things in order, and I managed to solve it. There were a number of things going wrong.
Here is the script that worked. It used to be a lot prettier but every time I try to beautify it, something goes wrong...
import arcpy
import os
import zipfile

# Script arguments
Input_zip = arcpy.GetParameterAsText(0)
arcpy.env.overwriteOutput = True

# Extract the uploaded zip into the scratch folder and remember the .shp
with zipfile.ZipFile(Input_zip) as z:
    for name in z.namelist():
        z.extract(name, arcpy.env.scratchFolder)
        if name.endswith('.shp'):
            result = os.path.join(arcpy.env.scratchFolder, name)

# Create a temporary connection file to the SQL Server geodatabase
sdecon = arcpy.CreateDatabaseConnection_management(
    arcpy.env.scratchFolder, "tempcon", "SQL_SERVER", "xxxxxx.xxxx.xxxxx",
    "OPERATING_SYSTEM_AUTH", "", "*****", "SAVE_USERNAME", "GDB", "",
    "TRANSACTIONAL", "sde.DEFAULT", "")
outpath = os.path.join(arcpy.env.scratchFolder, "tempcon.sde")

# Load the shapefile into the geodatabase, merge it with the existing
# feature class, then swap the merged result in for the original
arcpy.FeatureClassToFeatureClass_conversion(result, outpath, "NEWFILE")
newfile = os.path.join(outpath, "GDB.SDE.NEWFILE")
oldfile = os.path.join(outpath, "GDB.SDE.MAIN")
renamefile = os.path.join(outpath, "RENAMEME")
arcpy.Merge_management([oldfile, newfile], renamefile)
fromname = os.path.join(outpath, "GDB.SDE.RENAMEME")
arcpy.Delete_management(oldfile, "FeatureClass")
arcpy.Delete_management(newfile, "FeatureClass")
arcpy.Rename_management(fromname, "MAIN")
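For what it's worth, the extraction loop can be isolated and tested without arcpy at all; here is a stdlib-only sketch of the same logic, where `dest` stands in for `arcpy.env.scratchFolder`:

```python
import os
import zipfile

def extract_shapefile(zip_path, dest):
    """Extract an uploaded zip into dest and return the path of the
    last .shp member found (None if the zip holds no shapefile)."""
    shp = None
    with zipfile.ZipFile(zip_path) as z:
        for name in z.namelist():
            z.extract(name, dest)
            if name.endswith('.shp'):
                shp = os.path.join(dest, name)
    return shp
```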
Thanks again. I hope some of the above will help other users.
Great, glad to help!
I have run into similar problems when publishing geoprocessing scripts. Just because it published fine does not mean it will run fine.
When you publish the script, it (ESRI) will manipulate your Python script and add its own interpretation of your variables, data locations, and connection files. You have to look at the error messages to hunt down the actual script that it is running, so you can either change things back or overwrite it with your original script.
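To hunt down the server-side copy for comparison against your original, something like the following can help. This is a hedged sketch: the example root in the comment comes from the error path in the question above, and should be adjusted for your own install.

```python
import os

def find_published_scripts(root):
    """Walk an ArcGIS Server input directory and list every .py file,
    i.e. the server's rewritten copies of published GP scripts."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        hits.extend(os.path.join(dirpath, f)
                    for f in files if f.endswith(".py"))
    return hits

# e.g. find_published_scripts(
#     r"C:\arcgisserver\directories\arcgissystem\arcgisinput")
```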
For example, I had a connection file to a SQL database so the service could geocode a view I created. Even though publishing made me add the connection file to the server's Data Store, the new script doesn't even use the Data Store; it copied the file and repointed my script to the ....\v101 directory. On top of that, it changed the connection file from a 'SQL Server' connection to an 'Application Server' connection, and it doesn't even open in ArcCatalog.
Basically, just be aware that once you publish, it is not your original script anymore, and you must start another round of troubleshooting.
Enjoy!
Dave
Could it be related to the database user's role and connection privileges? Please check those.