Geoprocessing service stuck on old feature schema

08-14-2015 04:42 PM by Anonymous User

I've encountered a lot of bugs in the Export Data Task that Esri ships as a toolbox in ArcMap (under Server Tools). I rewrote it to get around them, so that it takes a string from my JavaScript web app listing the layers I want exported, then dumps out a zipped folder of the files.

In the Python script I hard-coded a shared folder, registered with the server, where all the .sde connection files live. From my JavaScript app I pass in a string of the feature class names I want from an SDE; Python parses it to build each full SDE path, copies the features from the SDE to the server, and then delivers the zipped folder.
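
For context, here is a minimal sketch of that approach. The folder path, connection file name, parameter indexes, and the semicolon delimiter are assumptions for illustration, not my exact code.

```python
# Minimal sketch: copy requested feature classes from an SDE and zip them up.
# Paths, names, and parameter order below are placeholders (assumptions).
import os
import zipfile
import arcpy

SDE_FOLDER = r"\\server\shared\connections"                  # registered folder holding the .sde files (assumed)
SDE_CONNECTION = os.path.join(SDE_FOLDER, "production.sde")  # assumed connection file name


def export_layers(layer_string, scratch_dir):
    """Copy each requested feature class out of the SDE and zip the results."""
    out_gdb = arcpy.management.CreateFileGDB(scratch_dir, "export").getOutput(0)

    # layer_string comes from the JavaScript app, e.g. "Parcels;Roads;Hydrants"
    for name in [n.strip() for n in layer_string.split(";") if n.strip()]:
        src = os.path.join(SDE_CONNECTION, name)             # full SDE path to the feature class
        arcpy.management.CopyFeatures(src, os.path.join(out_gdb, name))

    # Zip the file geodatabase so the service hands back a single download
    zip_path = os.path.join(scratch_dir, "export.zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for root, _, files in os.walk(out_gdb):
            for f in files:
                if f.endswith(".lock"):                      # skip geodatabase lock files
                    continue
                full = os.path.join(root, f)
                zf.write(full, os.path.relpath(full, scratch_dir))
    return zip_path


if __name__ == "__main__":
    layers = arcpy.GetParameterAsText(0)          # the single string input parameter
    out_zip = export_layers(layers, arcpy.env.scratchFolder)
    arcpy.SetParameter(1, out_zip)                # derived output: path to the zip file
```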

If I edit or move a feature in a feature class, no problem: the script grabs the edits as soon as they are saved. However, if I add or delete a field, it breaks. The GP service somehow holds onto the feature class's schema as it was at publish time and can only deal with those fields (it looks for fields that have since been deleted, and doesn't serve up fields that are new).
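
For debugging, this is the kind of snippet I can drop into the script to log what arcpy itself sees at run time and compare it with what the published service returns. The connection path and the "Parcels" name are placeholders, not real values from my setup.

```python
import os
import arcpy

# Placeholder connection file and example feature class name (assumptions).
sde_connection = r"\\server\shared\connections\production.sde"
fc = os.path.join(sde_connection, "Parcels")

# Log the fields arcpy reads when the task actually executes, so the live
# schema can be compared against what the GP service reports.
arcpy.AddMessage("Fields at run time: " + ", ".join(f.name for f in arcpy.ListFields(fc)))
```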

What?! This GP service takes a string as its only input parameter. How is it going through every SDE and taking a snapshot of the schema at publish time? And why just the field names and not the extent? Or is it the SDE connection files that do that? I'm at a loss. All I want is the most up-to-date data when I run the data download tool from my web app... I thought connecting directly to the SDE and using CopyFeatures would accomplish that.
