POST
Running the script does not cause a crash, and there aren't any noticeable problems with the mxds that don't output information. I have not used MXD Doctor, but I will give it a try against some of the mxds and include what information I can. From the initial runs of the tool against the problematic mxds, it did not find any corrupt objects. My thought so far is that there is something about the data connections or the structure of the layers within the problem mxds that the arcpy method does not like. The first thing I did was separate the functional files from the non-functional files into two separate directories, to try to observe any major differences between them; I have found nothing so far. I will continue trying things on my own and will post anything that I find. Thank you for the response.
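As a follow-up for anyone comparing the two sets of files: one quick way to surface gross differences between the working and broken directories is to tabulate file-level properties side by side before digging into arcpy. This is a generic sketch; the directory names are placeholders, not from the original post:

```python
import os

def summarize_dir(path):
    """Return {filename: size_in_bytes} for every .mxd in a directory."""
    return {
        name: os.path.getsize(os.path.join(path, name))
        for name in os.listdir(path)
        if name.lower().endswith(".mxd")
    }

def compare_dirs(good_dir, bad_dir):
    """Print a size summary for the working vs. broken mxds."""
    good, bad = summarize_dir(good_dir), summarize_dir(bad_dir)
    for label, stats in (("working", good), ("broken", bad)):
        if stats:
            sizes = sorted(stats.values())
            print("%s: %d files, sizes %d-%d bytes"
                  % (label, len(stats), sizes[0], sizes[-1]))
    return good, bad
```

If one group is consistently much larger, or was last modified by a different ArcMap version, that can narrow down what the problem mxds have in common.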
Posted 11-22-2019 01:49 PM

POST
I have put together a python script which iterates through a directory of mxd files and outputs their paths, layers, fields, and descriptive information to a csv file. When I run the script, it successfully outputs the desired information for about half of the mxds in the directory, but does not output anything for the others. After breaking the code apart and running it, I can confirm that the paths to these 'broken' mxds are being added to my initial lists along with the others, but for some reason the arcpy.mapping.MapDocument(mxd) method is not able to retrieve their information (I am passing the mxd paths to this method). My main question is: what are some potential reasons why this method works for some mxds but not others? The problem might also lie within the arcpy.mapping.ListLayers(mxd) method. Here is the section of the script where this is happening (indentation restored here; it was lost when I originally copied the script over):

    #Iterate through the list of ArcMap Documents...
    for mxdpath in mxd_list:
        #mxdname = os.path.split(mxdpath)[1]
        try:
            mxd = arcpy.mapping.MapDocument(mxdpath)
            #Iterate through the ArcMap Document layers...
            fields = []
            layers = []
            for layer in arcpy.mapping.ListLayers(mxd):
                layers.append(layer)
                fList = arcpy.Describe(layer).fields
                for field in fList:
                    fields.append("Field = " + field.baseName)
                layerattributes = [mxdpath, ' ', layer.longName, ' ', layer.dataSource, ' ', fields]
                #Write the attributes to the csv file...
                writer.writerow(layerattributes)
        except:
            arcpy.AddMessage("EXCEPTION: {0}".format(mxdpath))
        del mxd

Exceptions are thrown when the script encounters the mxds in question. Through Visual Studio, the only information I can find is:

    <entry>
      <record>728</record>
      <time>2019/11/21 22:40:32.740</time>
      <type>Error</type>
      <source>Editor or Editor Extension</source>
      <description>System.ArgumentOutOfRangeException: Index was out of range. Must be non-negative and less than the size of the collection.
    Parameter name: index
       at System.ThrowHelper.ThrowArgumentOutOfRangeException(ExceptionArgument argument, ExceptionResource resource)
       at Microsoft.NodejsTools.Repl.ReplOutputClassifier.GetClassificationSpans(SnapshotSpan span)
       at Microsoft.VisualStudio.Text.Classification.Implementation.ClassifierTagger.<GetTags>d__5.MoveNext()
       at Microsoft.VisualStudio.Text.Tagging.Implementation.TagAggregator`1.<GetTagsForBuffer>d__39.MoveNext()
    --- End of stack trace from previous location where exception was thrown ---
       at Microsoft.VisualStudio.Telemetry.WindowsErrorReporting.WatsonReport.GetClrWatsonExceptionInfo(Exception exceptionObject)</description>
    </entry>

I'm not sure how to interpret this, as there is nothing I can discern that differentiates the mxds which are working from those which aren't. To be honest, I'm not even entirely sure whether this error is related to the mxds or to Visual Studio itself. Any information is appreciated. Unfortunately I cannot provide any info regarding the specific mxds, as they contain confidential enterprise info, but if anyone has ideas I am happy to provide non-specific info (cached or not? size? etc.). Thank you.
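One observation on the log above: that stack trace comes from Visual Studio's Node.js Tools REPL output classifier, so it is plausibly unrelated to arcpy. The bare `except:` in the script swallows the actual Python error. A sketch of a pattern that records the full traceback per file instead, written here without arcpy so it runs anywhere; the `open_document` parameter is a stand-in (an assumption) for `arcpy.mapping.MapDocument`:

```python
import traceback

def inventory(paths, open_document):
    """Try to open each document; collect successes and full tracebacks.

    open_document stands in for arcpy.mapping.MapDocument -- substitute
    the real call when running inside ArcMap's Python environment.
    """
    opened, errors = [], {}
    for path in paths:
        try:
            opened.append(open_document(path))
        except Exception:
            # Capture the complete traceback instead of swallowing it
            # with a bare 'except:' -- this usually names the real cause.
            errors[path] = traceback.format_exc()
    return opened, errors
```

Inside ArcMap you would pass `arcpy.mapping.MapDocument` as `open_document` and print (or `arcpy.AddMessage`) each entry of `errors`; the recorded traceback should say whether `MapDocument`, `ListLayers`, or `Describe` is the call that actually fails for the broken mxds.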
Posted 11-22-2019 09:43 AM

POST
Okay, so I have now tried running the script many times pointing to the .xml tile schemes output by the tool, and it is not working as intended. I am not ruling out the possibility that I am misconfiguring the Generate Tile Cache Tiling Scheme tool, but I have tried many different parameters and I am still experiencing the same issue (in certain configurations the tiles do not appear at any scale). I noticed in the Create Vector Tile Package tool documentation that when the service type is set to EXISTING, the parameter can be either an xml tile cache scheme or an existing vector tile service. So I used the URL pointing to the vector tile service that was created through the UI (the one that zooms to all levels) as a parameter to build the vtpks in the script. When I ran the script, it did begin to build the vector tile package, but it then failed with the error: ERROR 000622 cannot set input into parameter max cached scale. I am going to keep trying with the Generate Tile Cache Tiling Scheme tool, but I will admit that at this point I am at a bit of a loss. I am confused as to why the UI output works fine while the python method fails to build all scales, even when they have the same input parameters. Is there a chance that this is a bug? As always, any information is appreciated. Thank you, Jeff
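One way to sanity-check a tiling scheme .xml before feeding it to Create Vector Tile Package is to read back the scale levels it actually defines. A minimal sketch, assuming the scheme follows the usual CacheInfo layout with `LODInfo` elements containing `LevelID` and `Scale` children (verify against your own exported file, since scheme layouts can vary):

```python
import xml.etree.ElementTree as ET

def list_scales(scheme_xml):
    """Return [(level_id, scale), ...] parsed from a tiling-scheme XML string."""
    root = ET.fromstring(scheme_xml)
    levels = []
    # Search anywhere in the tree so differently nested schemes still match.
    for lod in root.iter("LODInfo"):
        level = int(lod.findtext("LevelID"))
        scale = float(lod.findtext("Scale"))
        levels.append((level, scale))
    return levels
```

If the smallest scale listed stops around 1:564 rather than going down toward 1:35, the scheme file itself is the reason the package stops there, rather than the geoprocessing call.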
Posted 09-12-2019 10:44 AM

POST
Earl, thank you for replying. You are correct, the problem has to do with tiles not being generated at zoomed-in scales. Upon inspecting the details of each item in portal, this is what I found: on the left is the vtpk generated from the UI, which builds tiles all the way down to 1:35.27; on the right is the vtpk generated from the script, which does not generate tiles past 1:564. So now my question is: what can I do to get the vtpk to build correctly through the python script? (Can I use EXISTING instead of ONLINE? If so, how?) Again, any input is appreciated. Thank you.
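For context on those two numbers: assuming the packages follow the standard ArcGIS Online tiling scheme (an assumption worth confirming; its level-0 scale is roughly 1:591,657,527.59 and each level halves the scale denominator), 1:564 and 1:35.27 land on specific levels of detail, which suggests the script-built package is simply stopping four levels early:

```python
# Level-0 scale of the standard ArcGIS Online / Web Mercator tiling scheme
# (an assumption -- confirm against the scheme actually in use).
LEVEL0_SCALE = 591657527.591555

def scale_at_level(level):
    """Scale denominator at a given LOD: each level halves the scale."""
    return LEVEL0_SCALE / (2 ** level)

def nearest_level(scale):
    """Find the LOD (0-24) whose scale denominator is closest to 'scale'."""
    return min(range(25), key=lambda lvl: abs(scale_at_level(lvl) - scale))
```

Under these definitions, 1:564 corresponds to level 20 and 1:35.27 to level 24, so the question becomes why the scripted run's maximum cached scale resolves to level 20 instead of 24.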
Posted 09-10-2019 01:00 PM

POST
I have put together a python script which creates vector tile packages and publishes them to our portal. Everything appears to work fine when zoomed out past about 1:282, but upon zooming in further the data does not display at all. The application in which these packages are used requires that they be visible when zoomed far in. I have tried creating the vtpk interactively through the ArcGIS Pro UI, and it works, with a minimum cache scale of 1:295,828,764 and a maximum cache scale of 1:35.27. But I need it to work through the python code, because this must be updated nightly and cannot be a manual process. This is the CreateVectorTilePackage call from the script:

    arcpy.management.CreateVectorTilePackage(
        map, outputPath + map.name + ".vtpk", "ONLINE", "", "INDEXED",
        1000000, 35.27, TileIndex,
        "Vector Package for application", "Vector Package for application")

If anyone has encountered similar problems or has any advice, it would be appreciated!
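One small detail worth hardening, independent of the scale problem: building the output path by string concatenation (`outputPath + map.name + ".vtpk"`) silently produces a wrong path whenever `outputPath` lacks a trailing separator. A sketch of the safer pattern (the helper name `build_vtpk_path` is mine, not from the original script):

```python
import os

def build_vtpk_path(output_dir, map_name):
    """Join the output directory and map name into a .vtpk path safely."""
    return os.path.join(output_dir, map_name + ".vtpk")
```

`os.path.join` inserts a separator only when one is needed, so the call works whether or not `output_dir` ends with a slash.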
Posted 09-06-2019 03:20 PM

POST
Andrew, thank you for your reply. I am still not sure exactly what the problem was with trying to view the slpk in portal, but we recently updated to ArcGIS Pro 2.4 and the problem is gone. I just figured I would let you know that the 2.4 update corrected the issue. Thanks again, Jeff
Posted 07-09-2019 03:04 PM

POST
I am attempting to publish LiDAR data in a point cloud scene layer package to our R&D GIS portal. I have tried some workarounds to correct this problem, but it persists. When I share and publish the .slpk file, it uploads successfully and provides a thumbnail image, but when I attempt to view it with Scene Viewer in portal I encounter a message telling me: "Layer cannot be added. User does not have permissions to access 'hosted/srplidar_extract_lasd.sceneserver'." I figured this was a permissions problem, so I logged into our portal as admin, and the error message changed to: "Layer cannot be added. Service not found." I have made sure that the data is in the correct coordinate system and projection. I have also read some documentation regarding issues like this stemming from browser security settings on the host server, but I am hesitant to change any of those without a clear picture of the problem, as they are potentially a security issue. Any advice is greatly appreciated, and I can provide more specific info if anyone has ideas. Thanks.
Posted 06-11-2019 04:00 PM