POST
I was able to resolve the issue with some experimentation and further digging on GeoNet (though it doesn't appear many people have encountered this and posted about it). A couple of threads led me to something that appears to be working:

https://community.esri.com/thread/158172
https://gis.stackexchange.com/questions/75138/how-to-publish-gp-service-with-importing-of-numpy-in-arcgis-for-server-10-1

The first thread highlights a similar issue with a GP service pre-publication analysis throwing a 00178 error, and it referenced the second thread. Both identified an issue with how import statements were constructed in the Python script. The solution in both cases was to make more specific calls to the individual functions, classes, etc., within the imported modules.

In my script tool, I was importing arcpy like this:

import arcpy

Pretty standard, I would think. I replaced that line with specific imports for only the names in use:

from arcpy import env
from arcpy import GetParameterAsText, SetParameterAsText, GetParameterInfo
from arcpy import AddMessage, AddWarning, AddError
from arcpy import Raster, Describe, CreateUniqueName
from arcpy import Intersect_analysis, Statistics_analysis, Select_analysis, AddField_management, CalculateField_management
from arcpy import FeatureToRaster_conversion, RasterToPolygon_conversion, JSONToFeatures_conversion, FeaturesToJSON_conversion
from arcpy.sa import Watershed, Con, IsNull, SetNull
from arcpy.da import SearchCursor, UpdateCursor

The GP pre-publication analysis then ran quickly and as expected: no bizarre attempts to import system executables or source data from non-existent directories.

I'm not sure if

from arcpy import *

...would have yielded the same result, but as this took several days for me to resolve, I'm not inclined to try anything else. Hopefully someone else finds this useful. Cheers!
06-13-2016 01:17 PM | 2 | 0 | 2074

POST
Unfortunately, no luck with the approach of using os.path functions to construct paths to scratch data. Analyze throws the same errors as before.
06-10-2016 10:45 AM | 0 | 0 | 2074

POST
I'm going back through the documentation to see if I missed anything critical. One thing I noticed here was this bit about how the script tool is scanned to discover any project data used in the script:

"When your script is scanned, every quoted string (either single- or double-quotes) used in a Python variable or as an argument to a function is tested to see if it is a path to data that exists."

I have a lot of quoted strings in my script: references to column names and field values; docstrings; info, warning, and error messages; parameters for arcpy tools (e.g., arcpy.conversion.RasterToPolygon(input, output, "SIMPLIFY")); and some empty strings (used as placeholders in data objects for values to be written by another application). If this scanning process always happens, that might explain why Analyze takes 10-15 minutes to complete, and perhaps why the process treats "" as a location that data needs to be added from. Could this explain the errors?

What I maybe don't understand is that I've followed the convention for parameterizing inputs and outputs described here. My reading is that the script shouldn't be scanned if the inputs/outputs are all set as parameters, which I've done.

---

Unrelated to the above: I've also noticed that all the examples in the documentation use os.path functions to manually construct paths to data, rather than using arcpy.CreateUniqueName() and/or the arcpy.env.scratchWorkspace variable to let ArcGIS sort it out. I will give that a go for my scratch data and see if that helps, as per my other comment.

Thanks in advance for any insight,
Christian
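As a thought experiment, here is a toy model of the documented scan (my own reading of the behavior, not Esri's actual scanner code), which hints at why an empty string could behave oddly when every quoted string is tested as a path — os.path.abspath("") resolves to the current working directory:

```python
import os

# Toy model of the documented scan: test each quoted string found in the
# script to see whether it looks like a path to existing data.
# This is my own illustration, not Esri's actual implementation.
quoted_strings = ["SIMPLIFY", "", r"C:\Windows"]

for s in quoted_strings:
    exists = os.path.exists(s)       # "" never "exists" as a path
    resolved = os.path.abspath(s)    # but "" resolves to the current directory!
    print(repr(s), exists, resolved)
```

If the real scanner normalizes candidate strings to absolute paths before testing them, an empty string would point at whatever directory the analysis happens to be running in — which would be consistent with whole drives showing up in the analyzer output.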
06-10-2016 08:06 AM | 0 | 0 | 2074

POST
I downgraded to 10.3 Desktop -- clean uninstall, wiped the user data directories, etc. Basically a fresh, unencumbered install of ArcGIS Desktop (first time in a while, and frankly it needed it). Unfortunately, the issue appeared, same as before. But at least I can rule out a version-incompatibility issue, and any other potential funkiness that may have been introduced through my normal.mxt or other customizations. Thanks!
06-10-2016 07:38 AM | 0 | 0 | 2074

POST
For all scratch outputs I have a small shortcut function that relies on arcpy.CreateUniqueName and the scratchWorkspace environment variable to handle filenames and locations:

def so(temp_file_name, in_mem=False):
    """scratch output full path generator"""
    if in_mem:
        return arcpy.CreateUniqueName(temp_file_name, "in_memory")
    else:
        return arcpy.CreateUniqueName(temp_file_name, arcpy.env.scratchWorkspace)

arcpy.env.scratchWorkspace = arcpy.env.scratchGDB

The scratch workspace is set to the scratch geodatabase. The issue also presented itself when I wrote to the in_memory workspace. I will try swapping out the guts of this function for one that uses the system path-construction functions instead, and see if that makes a difference.
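For what it's worth, a minimal sketch of the os.path-based replacement I have in mind. The function name, and the uuid-based suffix standing in for CreateUniqueName's collision handling, are my own; in the real script, scratch_workspace would come from arcpy.env.scratchWorkspace:

```python
import os
import uuid

def so_path(temp_file_name, scratch_workspace, in_mem=False):
    """Scratch-output path generator built on os.path instead of
    arcpy.CreateUniqueName (a sketch, not the production function)."""
    workspace = "in_memory" if in_mem else scratch_workspace
    # A short uuid suffix stands in for CreateUniqueName's collision handling.
    unique_name = "{0}_{1}".format(temp_file_name, uuid.uuid4().hex[:8])
    return os.path.join(workspace, unique_name)

print(so_path("tmp_watershed", r"C:\ags\scratch.gdb"))
```

The idea is that plain quoted-string path construction via os.path is what the documentation examples use, so it may be friendlier to the publication analyzer than arcpy's own path helpers.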
06-10-2016 07:33 AM | 0 | 0 | 2074

POST
It is - I originally wrote it using 10.3 Desktop. That sounds as good a starting point as any, though. I'll revert to 10.3 and post the results here. Cheers!
06-09-2016 10:28 AM | 0 | 0 | 2074

POST
I am attempting to publish a geoprocessing tool (referenced as "Delineate Core" below) from ArcGIS Desktop 10.4.1 to ArcGIS Server 10.3 (for Windows) using the workflow described in the documentation, and am running into some rather bizarre errors when analyzing the service that are keeping it from publishing. They are as follows:

SEVERITY | STATUS | CODE | DESCRIPTION | NAME | TYPE | DATA FRAME
---|---|---|---|---|---|---
Error-High | Unresolved | 178 | Data: C:\Windows\bfsvc.exe, used by Script Delineate Core cannot be copied to the server | delineate | Tool | Unknown
Warning-High | Unresolved | 24032 | Data source used by Script Delineate Core is not registered with the server and will be copied to the server: C:\dev\null | delineate | Tool | Unknown
Warning-High | Unresolved | 24032 | Data source used by Script Delineate Core is not registered with the server and will be copied to the server: C:\ | delineate | Tool | Unknown

I'm at a bit of a loss. bfsvc.exe is a system executable (note that it is the first one encountered alphabetically in the Windows install directory). I don't have a "C:\dev\null" directory. And I'm not sure why it would be looking to copy my entire "C:\" drive. None of these things are referenced in my geoprocessing tool. The analysis process also takes about 15 minutes to run, which makes me think that something in either my script or the way I'm publishing the service is causing the Analyze process to look all over the place for...what? I don't know.

A rundown of my geoprocessing tool:

- The geoprocessing tool is a Python script tool in an ArcToolbox.
- The script doesn't have any third-party Python module dependencies, only arcpy and a few modules from the standard library.
- All 7 inputs are parameters (i.e., using arcpy.GetParameterAsText(n)); none are hard-coded file paths, and the script contains no hard-coded file paths.
- When I publish the tool, I set 5 of the inputs as constants, as the data comes from a file geodatabase (which is also registered with the ArcGIS Server).
- There are 2 output parameters: one is a string, and one is a JSON file. Output parameters are set using arcpy.SetParameterAsText(n).
- The ArcToolbox and Python script live in the same folder as the registered file geodatabase mentioned above: "c:\ags".

I attempted a publication workflow with the tool located on my "D:\" drive. I didn't get error code 178, but still got references to a non-existent "D:\dev\null" location and the entire "D:\" drive, similar to the table above. I gave publishing a go in that case (why not, since no errors were holding it back), but it hung up.

Where do I continue troubleshooting?

Many thanks in advance,
Christian
06-09-2016 09:36 AM | 2 | 9 | 5671

POST
I had a similar question and came across this: ArcGIS Server Link for Google Maps API: Examples, which looks like it was later moved to GitHub: v3-utility-library/arcgislink at master · googlemaps/v3-utility-library · GitHub. It hasn't been updated in 3+ years, but the examples still work.
02-12-2016 08:28 AM | 0 | 0 | 2623

POST
I thought maybe the workspace was corrupted (I created it over 2 years ago with 2010.3), so I re-installed CE and created a new workspace. I imported the old project into the new workspace and opened it up; the free memory dropped from 1700 to 140, and CE hung while loading the scene. I only noticed this happening after developing a CGA rule that uses a lot of parameters from an imported rule, like this:

import iR: importedRule.cga
iR.rule(attr1, attr2... attr36) --> nextRule(attr1, attr2... attr36)

If I comment out the rule (in a text editor outside CE) and then restart CE, the scene loads up without issue. If I un-comment the rule while in CityEngine, I can work with the scene for some time, but then the window I described appears unpredictably upon a random save. That's really the only thing I can think of that I've done out of the ordinary since this started happening. Any insight appreciated.
02-15-2013 06:28 AM | 0 | 0 | 334

POST
Is there a bit of code that can tell me which way is up and down on a surface? I know you can assess slope angle of a surface - can it be made to tell me the direction?

If you are working with a raster topographic surface and have access to Spatial Analyst, a less dynamic but workable option would be to use the Aspect operation in Spatial Analyst to create a raster where cell values indicate the direction of the slope in degrees (0-360, with 0 = North). The direction the operation provides is for down slope; up slope could be obtained by adding or subtracting 180; cross slope (perpendicular to down or up slope) by adding or subtracting 90.

You could bring one or several aspect rasters ('downSlope', 'upSlope', 'crossSlope') into CE as map layers. Then use a CGA rule applied to the objects to:

- get the orientation in degrees from the desired map layer (the map layer used could be exposed as an attribute)
- use the value from the map layer to rotate the object the number of degrees that would make it face downSlope, upSlope, or crossSlope.

This would require the imported objects all to be facing north prior to reorientation.
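The aspect arithmetic above can be sketched in plain Python (assumed conventions, matching the Aspect output described: degrees, 0 = North, values wrapping modulo 360; the function names are mine, not from arcpy):

```python
# Aspect arithmetic for slope directions, as described above.
# Assumptions (mine): aspect is in degrees, 0 = North, wraps modulo 360.

def upslope(downslope_aspect):
    """Up slope is opposite the downslope aspect: add or subtract 180."""
    return (downslope_aspect + 180) % 360

def cross_slope(downslope_aspect, clockwise=True):
    """Cross slope is perpendicular: add or subtract 90."""
    offset = 90 if clockwise else -90
    return (downslope_aspect + offset) % 360

print(upslope(45))              # 225
print(cross_slope(350))         # 80
print(cross_slope(10, False))   # 280
```

The modulo keeps results in the 0-360 range even when adding past 360 or subtracting below 0, which is why a single formula covers both the "adding or subtracting" cases.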
02-14-2013 02:38 PM | 0 | 0 | 579

POST
Recently I've encountered a pop-up window that shows a list of tasks akin to the progress window (screenshot attached). It sometimes appears after I Save All. It says: The user operation is waiting for "Building workspace" to complete. Once this appears, a few seconds later the free memory (the Java memory side) drops and I get the purge-memory question. After that, the progress bar doesn't move, though the Windows Task Manager shows my CPU maxed out by CE. The only way out is to end the process. I'm not working with a particularly large scene (<8 MB) or building complex 3D geometry; I am working with several CGA files that pass back and forth a few dozen attributes (I didn't notice the issue before I created that CGA file). 64-bit CE, 8 GB of RAM... Is this a memory issue or a matter of rebuilding the workspace?
02-14-2013 02:16 PM | 0 | 3 | 1123

POST
Rather than keeping building types in a single CGA file, it might be easier to split them into separate files and import the CGA file for the appropriate building type into a master.cga. The imported building-type CGA file can be selected conditionally, based on the value from the raster constraint map layer. It will keep the code a bit neater and make it more expandable (for more building types). I'm working on a similar problem (though I hadn't considered using styles); you may find the discussions in this thread informative.
02-06-2013 02:44 PM | 0 | 0 | 226

POST
I try to pass down all values needed from the master rule down to the slave rules, if somehow possible.

^ I'm finding this part of it a bit challenging still. I'd like to work towards using the master to distribute attribute values, but I'd also like to:

- store attributes and some logic in several CGA files (we'll call them 'attributeStorage'; in my actual work this would be a rule file that contains all the dimensions and conditions described in a single land use zone)
- use some logic in a master CGA file to identify which attributeStorage file to reference the attributes from
- use those attribute values in another rule

This is a simplified version of some test code where I've tried to do this, using the basic structure from Matt's example:
###############
# master.cga
import as1: "C:\filepath\attributeStorage1.cga"
import as2: "C:\filepath\attributeStorage2.cga"
import mr1: "C:\filepath\massingRules1.cga"
import mr2: "C:\filepath\massingRules2.cga"

attr master_height = 0 # empty

startRule -->
    case someLogic:
        set(master_height, as1.stored_height) as1.startRule
    case someOtherLogic:
        set(master_height, as2.stored_height) as2.startRule
    else: doNothing.

as1.as(type) --> as(type)
as2.as(type) --> as(type)

as(type) -->
    case type == "type1": mr1.startRule
    case type == "type2": mr2.startRule
    else: doNothing.

###############
# attributeStorage1.cga
attr stored_height = 10
attr type = "" # empty

startRule -->
    case someLogic: set(type, "type1") as(type)
    case someOtherLogic: set(type, "type2") as(type)
    else: doNothing.

###############
# attributeStorage2.cga
attr stored_height = 20
attr type = "" # empty

startRule -->
    case someLogic: set(type, "type1") as(type)
    case someOtherLogic: set(type, "type2") as(type)
    else: doNothing.

###############
# massingRules1.cga
attr master_height = 0 # empty; overwritten when imported by master.cga (?)

startRule -->
    extrude(master_height)
    Massing.

###############
# massingRules2.cga
attr master_height = 0 # empty; overwritten when imported by master.cga (?)

startRule -->
    extrude(master_height * 0.5)
    Massing.
The processing roughly works like this:

1. master.cga applies some logic to determine which of the imported 'attributeStorage' CGA files to reference.
2. The master CGA resets an empty attribute based on the corresponding attribute in the imported 'attributeStorage' CGA file.
3. The selected attributeStorage rule applies some logic to set another parameter.
4. The master.cga looks at the rule passed back from attributeStorage and applies some logic to select which massingRule to reference.
5. The massingRule uses the parameter that was reset in step 2 to extrude.

This does work...but really only when I have to pass one attribute ("master_height") around. It gets very unwieldy and repetitive if I have a dozen attributeStorage CGA files storing two dozen attributes each.

To avoid trying to directly pass attributes, I was hoping to take advantage of the fact that the attribute of an imported rule is overwritten by an attribute of an importing rule if they have the same name - but I don't quite understand the conventions for preserving attributes (or not preserving them) as described in the help documentation for the import statement. And while I can usually get the importing rule's attribute to overwrite the imported attribute, I can't seem to use the convention correctly to preserve an attribute so I can pass the value to another imported rule.

I have a feeling I'm making this more complicated than it needs to be.

-Christian
02-06-2013 11:40 AM | 0 | 0 | 774

POST
Thank you for that, Matthias. Looping back to the master script to aggregate reporting and reporting parameters will definitely simplify things a bit.

Working through this a bit more, it looks like my solution may need to be a bit of a hybrid of both methods, if only to avoid the attribute-name overrides that can occur when importing rules - though I think I've worked out some logic to include within my rules that avoids the issue.

I am working with essentially two types of functions: those used for analyzing the geometry of models to adjust how massing occurs mid-rule (which do not need to have any parameters exposed in the Inspector), and those used for reporting (which don't have exposed parameters at the moment, but certainly could, e.g., adjustable population-per-dwelling-unit assumptions). So I think my solution is to:

- store the geometry-analyzing functions in one CGA file, and import that into the "imported" CGA files, as ChrisWilkinsGeo suggested.
- for information to be reported, bring it back to the master.cga, so that the parameters can be managed easily in the Inspector, and then report it through the rules executed in another imported CGA file (containing reporting functions).

Well...all that may not be articulated very well, but it is clear(er) in my head now anyway, and my mock-up seems to be operating as expected. I'll experiment a bit more and draw up a diagram to share once I've sorted it.

-Christian
02-02-2013 04:07 PM | 0 | 0 | 774

POST
Ah - that worked perfectly! Many thanks for the response and example. I have a much better understanding now of what the import statement can do. The result is attached. The 'imported' CGA files mass up the building envelopes for different land uses, with estimates of GFA, population, parking requirements, student generation, etc.: [ATTACH=CONFIG]21321[/ATTACH] -Christian
02-01-2013 06:44 AM | 0 | 0 | 774
Title | Kudos | Posted
---|---|---
 | 2 | 06-09-2016 09:36 AM
 | 2 | 06-13-2016 01:17 PM
Online Status | Offline
Date Last Visited | 11-22-2021 10:34 PM