POST
My organization has stood up a test/dev environment running ArcGIS Server/Image Server 10.9. Wanting to test the functionality of an image service, I published a mosaic dataset. However, I am unable to select a folder to publish to (see below ... the folders do not show up). Our organization typically organizes map services by topic/department via folders, but I am not given that option with an image service. I will note that the service publishes just fine; it just publishes to the root. Also note that when publishing regular map services I am given the option to choose a folder. Has anyone else experienced this with Image Server? I am wondering if it's a bug with 10.9 ... I may be upgrading to 10.9.1 soon, so I will report back if the issue persists.
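One workaround I may try in the meantime is publishing the service from arcpy, since CreateImageSDDraft exposes a folder_name parameter. A rough, untested sketch (the paths, service name, and the "Imagery" folder are placeholders for illustration):

```python
import arcpy

# Placeholder paths/names -- adjust to your environment
mosaic = r"C:\Data\Imagery.gdb\MyMosaic"
sddraft = r"C:\Temp\MyMosaic.sddraft"
sd = r"C:\Temp\MyMosaic.sd"
server_con = r"C:\Connections\myserver.ags"

# CreateImageSDDraft accepts a folder_name, which may work even
# though the publishing UI isn't listing the folders
arcpy.CreateImageSDDraft(mosaic, sddraft, "MyMosaic_ImageService",
                         "ARCGIS_SERVER", server_con,
                         copy_data_to_server=False,
                         folder_name="Imagery",
                         summary="Test image service",
                         tags="imagery, test")

# Stage and upload the service definition
arcpy.server.StageService(sddraft, sd)
arcpy.server.UploadServiceDefinition(sd, server_con)
```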
12-20-2021 01:53 PM | 0 | 1 | 1492

POST
Bummer, I was hoping there would be an arcpy function to test if a GDB is accepting connections. Maybe ESRI can include that functionality in future releases?
12-14-2021 06:51 AM | 0 | 0 | 2836

POST
My organization actually just migrated our scripts to a dedicated server. Before, we had a setup similar to what the OP probably has now: a server with an enterprise GDB/SQL Server and scripts set up in Task Scheduler. In an effort to separate GIS tasks/workflows, we stood up an automation-focused server. Honestly, it's set up the same as we had it before ... scheduled tasks are still set up in Task Scheduler. No real issues with the new server setup. In fact, scripts run slightly quicker; some of that might have to do with the newer operating system we migrated to. As for Notebook Server, that's something I have been interested in exploring but don't have any experience with at the moment. I'd also be interested to hear how other organizations are using it for their day-to-day operations.
12-13-2021 09:56 AM | 2 | 0 | 1254

POST
Thank you for the reply. We already use the AcceptConnections function to block or accept connections in our database maintenance scripts. What I'm looking for is a way to check if connections are blocked/open.
12-13-2021 09:49 AM | 0 | 3 | 2879

POST
Our organization has a database maintenance script that runs nightly. Without going into too much detail about the script's tasks, the first step is to block any connections to the GDB and the last step is to open connections back up. While it is rare, once in a blue moon the script will stop mid-run and leave the database not allowing connections. We usually find this out when end users are prompted with an error message the next morning and we have to manually check the box to allow connections. I should note we do have safeguards in place if the script runs into any geoprocessing errors; the script will send admin users an email with the error that was encountered and re-establish database connections as a final step. However, there are strange instances where the script stops completely with no errors, thus never taking that final step to accept connections. My question is: is there a way, using Python, to check whether an enterprise geodatabase is accepting connections? Ideally I would like to write a simple script that performs the following tasks:
1. Check if the GDB is accepting connections
2. If yes, end the script
3. If no, send an alert message to admins and accept connections
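Something like the untested sketch below is what I have in mind. It assumes a non-admin connection file (gdb_user.sde) that would be refused while connections are blocked, an admin connection file (gdb_admin.sde) for re-enabling them, and placeholder SMTP details; whether Describe actually errors out on a blocked GDB is something I'd still need to verify.

```python
import arcpy
import smtplib
from email.message import EmailMessage

# Placeholder connection files -- adjust to your environment
user_sde = r"C:\Connections\gdb_user.sde"    # non-admin connection
admin_sde = r"C:\Connections\gdb_admin.sde"  # sde/admin connection

def gdb_accepting_connections():
    """Crude check: try to open the GDB with a non-admin connection.
    If the GDB is blocking connections, this should raise an error
    (worth verifying in your own environment)."""
    try:
        arcpy.Describe(user_sde)
        return True
    except Exception:
        return False

if not gdb_accepting_connections():
    # Re-open the geodatabase to all users
    arcpy.AcceptConnections(admin_sde, True)

    # Alert the admins (placeholder SMTP server/addresses)
    msg = EmailMessage()
    msg["Subject"] = "GDB was left blocking connections - reopened"
    msg["From"] = "gis@example.com"
    msg["To"] = "admins@example.com"
    msg.set_content("The nightly maintenance script left the GDB blocking "
                    "connections; AcceptConnections(True) has been run.")
    with smtplib.SMTP("mailserver.example.com") as s:
        s.send_message(msg)
```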
12-13-2021 09:19 AM | 1 | 5 | 2899

IDEA
I tried testing the approach above and could not get it to work. Tried going from line to polygon and vice versa; it doesn't allow the import. I suppose it makes sense because the symbology for lines and polygons is completely different ... but I figured I would give it a shot because, like the OP mentioned, it's quite annoying to match colors between layers with hundreds of records ...
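One scripted workaround I may experiment with is copying the colors value-by-value instead of importing the symbology. This is a rough, untested sketch: it assumes ArcGIS Pro's arcpy.mp, that both layers already use unique values renderers on the same field, and that the map/layer names are placeholders. Whether the color property round-trips cleanly between polygon and line symbols is something I'd have to test.

```python
import arcpy

# Placeholder project/map/layer names for illustration
aprx = arcpy.mp.ArcGISProject("CURRENT")
m = aprx.listMaps("Map")[0]
src_lyr = m.listLayers("ParcelsPolygons")[0]   # source (polygon) layer
tgt_lyr = m.listLayers("ParcelsLines")[0]      # target (line) layer

src_sym = src_lyr.symbology
tgt_sym = tgt_lyr.symbology

# Build a lookup of unique-value label -> color from the source renderer
colors = {}
for grp in src_sym.renderer.groups:
    for item in grp.items:
        colors[item.label] = item.symbol.color

# Apply matching colors to the target renderer, geometry type aside
for grp in tgt_sym.renderer.groups:
    for item in grp.items:
        if item.label in colors:
            item.symbol.color = colors[item.label]

# Push the modified symbology back to the layer
tgt_lyr.symbology = tgt_sym
```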
09-16-2021 06:43 AM | 0 | 0 | 5638

POST
Was using a 10.5 machine and needed a basemap; encountered the same error message. Running PatchFinder indicated I was definitely missing the TLS patch. Installed it and the error went away. 👍
04-27-2021 05:13 AM | 0 | 0 | 11850

POST
Thank you for the advice, I'll make a note of the machine ID in case it happens again. I should note that I had a call with ESRI support a couple of weeks ago and we went through the same steps I outlined above, and it hasn't broken since then.
01-04-2021 10:46 AM | 0 | 0 | 1591

POST
Apologies for the delayed response; what with the holidays and other projects going on, we got somewhat sidetracked. To answer your question, we don't have any database views in our eGDB, so we can rule that out. In fact, we went ahead and implemented the 'list approach': basically just a list of feature classes/datasets/tables hardcoded in the scripts that the analyze/rebuild index tools run against. While admittedly not the best approach, it did cut back on the nightly processing time. One of the main issues is keeping the list up to date; sometimes it's a bit of a guessing game with some of the lesser-edited layers. But I'm thinking something like this database summary dashboard may assist us with that task.
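For anyone curious, the list approach boils down to something like this simplified sketch (the admin connection file and the owner-qualified dataset names are placeholders):

```python
import arcpy

# Placeholder admin connection and hardcoded dataset list
admin_sde = r"C:\Connections\gdb_admin.sde"
datasets = ["GIS.DBO.Parcels", "GIS.DBO.Roads", "GIS.DBO.Permits"]

# Rebuild indexes, then update statistics, only on the listed datasets
arcpy.management.RebuildIndexes(admin_sde, "NO_SYSTEM", datasets, "ALL")
arcpy.management.AnalyzeDatasets(admin_sde, "NO_SYSTEM", datasets,
                                 "ANALYZE_BASE", "ANALYZE_DELTA",
                                 "ANALYZE_ARCHIVE")
```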
01-04-2021 10:43 AM | 0 | 0 | 4265

POST
Our organization acquired a concurrent use CityEngine license earlier this spring, but we've run into licensing errors with it numerous times. It appears that after a certain amount of time, the license on the server hosting it becomes corrupt. For instance, if an end user tries to access CE on their desktop, they are presented with a message stating there are no licenses available. Looking at the server hosting the license, in License Server Administrator we are presented with the message below; you can also see that under the Available column it states "No". The only resolution I've found is to delete the FLEXnet folder on the license server > recover the lost license via My ESRI > and reauthorize the license (per this tech article). This has happened to us around 3 or 4 times, and the resolution is not the most elegant; there is obviously some issue going on that corrupts the license, but I cannot figure it out (yet). I was just wondering if other users had encountered this issue before? If so, what was your resolution?
12-21-2020 12:11 PM | 0 | 2 | 1628

POST
We ran the script again last night; it took 20 more minutes than the previous night's run. We definitely need to look into a way of optimizing the task. To address some items that you mentioned, we do indeed use the Rebuild Indexes tool. As for testing Remove Spatial Index and Add Spatial Index, plus Remove Attribute Index and Add Attribute Index, we simply haven't had time to mess with/test those tools yet. At the moment, my colleague and I are exploring the idea of using editor tracking to determine whether a feature class needs to be analyzed or have its indexes rebuilt. The only caveat is that we have a decent number of feature classes that get edited/updated nightly via Python script; those don't have editor tracking enabled, and we wouldn't want to overlook those layers. I suppose a quick solution to that problem would be to add them to a list of layers to always run the analyze/rebuild index tools against. In any event, I'll keep our progress updated on this thread; it may be beneficial for other organizations facing similar issues.
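The rough idea we're toying with looks something like the sketch below (untested; it assumes editor tracking uses a last_edited_date field, and the connection file and layer names are placeholders). It also only walks stand-alone feature classes; anything inside feature datasets would need an extra ListDatasets loop.

```python
import arcpy
from datetime import datetime, timedelta

admin_sde = r"C:\Connections\gdb_admin.sde"
cutoff = datetime.now() - timedelta(days=1)   # edits since the last nightly run

# Layers edited nightly by script (no editor tracking) -- always include
always_include = ["GIS.DBO.Permits", "GIS.DBO.Addresses"]

def edited_since(fc, cutoff):
    """Return True if any row's last_edited_date is newer than the cutoff."""
    dates = [row[0] for row in arcpy.da.SearchCursor(fc, ["last_edited_date"])
             if row[0] is not None]
    return bool(dates) and max(dates) > cutoff

arcpy.env.workspace = admin_sde
to_process = list(always_include)
for fc in arcpy.ListFeatureClasses():
    fields = [f.name.lower() for f in arcpy.ListFields(fc)]
    if "last_edited_date" in fields and edited_since(fc, cutoff):
        to_process.append(fc)

# to_process would then be fed to AnalyzeDatasets / RebuildIndexes as usual
```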
12-11-2020 12:06 PM | 1 | 0 | 4411

POST
Did you register the feature dataset as versioned after renaming it? Not having it registered as versioned would produce that behavior. Also, was there a particular error message it produced when y'all tried to edit the feature classes within?
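If it helps, a quick way to check (and fix) that from Python might be something like this sketch (the dataset path is a placeholder):

```python
import arcpy

# Placeholder path to the renamed feature dataset
fds = r"C:\Connections\gdb_admin.sde\GIS.DBO.Utilities"

# isVersioned reports whether the dataset is registered as versioned
if not arcpy.Describe(fds).isVersioned:
    arcpy.management.RegisterAsVersioned(fds, "NO_EDITS_TO_BASE")
```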
12-10-2020 11:18 AM | 0 | 0 | 694

POST
Just looked at ArcMap; it does appear that Symbology and Labels default to the alias name. I'm not aware of any setting to change it to the field name when using the Symbology/Labels tabs. However, the Definition Query tab (and Query Builder) default to the field name. I know you didn't ask about it, but you can change the attribute table to display the field name rather than the default alias: open the attribute table > click the drop-down on the left > uncheck the option for Show Field Aliases. Really, the only way I am aware of to set ArcMap so you always see the field name would be to change the aliases so they equal the field name; not sure if you have the privileges to do so if you are working with an enterprise GDB.
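If you do have the privileges, a scripted way to sync the aliases might look like the sketch below (untested; it uses the Alter Field geoprocessing tool, the feature class path is a placeholder, and you'd need to run it as the data owner):

```python
import arcpy

# Placeholder feature class -- run as the data owner
fc = r"C:\Connections\gdb_owner.sde\GIS.DBO.Parcels"

# Set every editable field's alias to match its actual field name
for fld in arcpy.ListFields(fc):
    if fld.type not in ("OID", "Geometry", "GlobalID") and fld.aliasName != fld.name:
        arcpy.management.AlterField(fc, fld.name, new_field_alias=fld.name)
```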
12-10-2020 11:13 AM | 1 | 1 | 3331

POST
In the past the script would generally take about 45 minutes to an hour. However, after examining the script a bit closer this week, my colleague and I determined that the loop for analyzing and rebuilding indexes was probably not working properly. Having made some adjustments to run those tasks on all feature classes, feature datasets, and non-spatial tables, the runtime was well over 5 hours (rebuilding the indexes was the task that took the longest). Now granted, this may be because analyzing and rebuilding indexes had not been performed on some layers; perhaps we will (hopefully) discover a much shorter runtime tonight with our testing. I would really like to use 'if-then-else' logic to determine whether or not a layer has been edited, but like you mentioned, I am not aware of any examples of how to determine this. In the meantime, I suppose one workaround I could deploy is to use a list of those layers that are known to be edited on a regular basis and have the tools run against only those layers. Not the best solution if we don't include the layers that get edited rarely, but it could still work.
12-10-2020 08:13 AM | 1 | 0 | 4424

POST
Thank you for the reply. Glad you mentioned switching the order of where we have Analyze Datasets; after reading some resources over the past week, my colleague and I planned on moving the analyze portion after the compress. And based on your response, it seems as though we are doing what you recommend: running analyze on all feature classes (regardless of where they live), feature datasets, and tables. While this basically covers everything in the database, one thing we've noticed is that it really affects how long the script runs. However, from what I understand, analyzing and even rebuilding indexes should really only be run on layers that have had edits performed on them. Just for clarification, we have hundreds of layers in our database. Some are edited heavily, with many edits on a daily basis; other layers are never edited at all (basemaps, if you will). Is there a way we could add some logic to our Python scripts to run analyze/rebuild indexes only on those feature classes/tables that have been edited since the last time the script ran? While the goal of our nightly task is to be as thorough as possible, I also don't want to waste time and machine resources on a script that takes hours to run.
12-10-2020 06:18 AM | 0 | 0 | 4429