POST
|
ArcGIS Enterprise already has arcpy in its Python environment, so why does Notebook Server running on the same machine require an expensive Advanced license to use arcpy? This license seems like a redundant money-grab. Has anyone found a workaround for using arcpy in Docker without paying an extortionate amount of money for the Advanced extension? It's built into Pro, but I've had trouble keeping Pro perpetually signed in. I'd also prefer to run scripts in Docker since the container is more or less isolated and impervious to sysadmin tinkering and Windows updates. What about Notebooks on AGO? Will those be able to support arcpy and consume non-AGO-hosted services (e.g. ArcGIS Enterprise hosted feature services) as tool parameters?
10-13-2020
08:54 AM
|
0
|
0
|
618
|
POST
|
When I do a Google or Bing search for my organization (ecgis), I get search results for my Hub and home pages, but they are named "ArcGIS" instead of the organization's name. How can this be fixed? Also, the default organization home page result says "We would like to show you a description here but the site won’t allow us." Is there a way to fix that too?
09-25-2020
12:16 PM
|
0
|
0
|
471
|
POST
|
Yes! This got my connection speed down to 3 seconds! Thank you!
09-10-2020
10:36 AM
|
1
|
0
|
3072
|
POST
|
Update: As suggested above, removing the GP history from the DB worked wonders for me.

We also have this issue. We're currently running 10.7.1. The initial connection to the MSSQLSERVER DB (LGIM schema) takes 12 seconds, and after that, opening a feature dataset takes an additional 7 seconds. It isn't a show-stopper, but the problem didn't exist in the 10.3 era. I've used UNC and DNS in my connections and there was no difference. I also tried compressing and rebuilding indices. That was interesting: the initial connection actually took 2 seconds longer but opening a feature dataset took 2 seconds less, so it was a wash.

QGIS connects relatively quickly and opens each dataset from the get-go in about the same amount of time Arc takes to make the initial connection. SSMS opens the database in under a second. I suspect it isn't a database issue; it's something that was introduced in later Arc versions (maybe at 10.5). A look at the release notes between 10.3 and 10.5 might provide some insight. IDK.

Further testing showed that recursively listing all feature classes in the database (regardless of whether or not they reside in a feature dataset) with arcpy takes 41 seconds. So, apples to apples, QGIS connects twice as quickly and SSMS is orders of magnitude faster. I'd be interested in seeing a comparison between 10.3 and 10.7. I might be curious enough to try it, and I'll post results here if I do. The takeaway for me is that it's looking more and more like Arc is responsible for the delay.

Here's a script to test for yourself. The first time you run it, it will take a while. Run it again right after that: it will run twice as fast because the initial handshake has already been made. At the very least, you'll be able to quantify the initial connection lag for your specific DB, so you'll have a baseline.
import os, time
import arcpy

# expandvars resolves %AppData%; a raw string alone won't expand it
gdb = os.path.expandvars(r'%AppData%\ESRI\Desktop10.7\ArcCatalog\yourconnection.sde')

start = time.time()
arcpy.env.workspace = gdb

# Prepend '' so feature classes at the gdb root get listed too
datasets = [''] + (arcpy.ListDatasets(feature_type='feature') or [])

for ds in datasets:
    for fc in arcpy.ListFeatureClasses(feature_dataset=ds):
        print(os.path.join(gdb, ds, fc))

print(time.time() - start)
09-10-2020
08:06 AM
|
1
|
0
|
3072
|
IDEA
|
It would be nice to have temporal overall usage reporting available in AGO, both on-screen and downloadable as a CSV. There doesn't seem to be any way to achieve this other than visiting every item individually. There are two very compelling reasons to do this. The first is to see total usage and use it to justify cost of ownership. The second is to keep track of which content is trending and which is stagnant or unused. It's kind of amazing that this doesn't exist, since item-level usage reporting is pretty robust. You can make a report CSV right now, but it's limited: you can't do any of the temporal reporting you can at the item level, and there are no charts, graphs, or on-screen presentation at all.
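In the meantime, a rough DIY version of the idea is possible: pull per-item daily counts (for example via Item.usage() in the ArcGIS API for Python) and sum them into one org-wide series. A minimal sketch of that aggregation step, where total_usage and the sample item IDs are made-up names for illustration:

```python
from collections import defaultdict

def total_usage(per_item_usage):
    """Sum daily request counts across items into one org-wide series.

    per_item_usage: dict mapping item id -> list of (date_str, count) rows,
    e.g. distilled from what Item.usage() returns per item.
    """
    totals = defaultdict(int)
    for rows in per_item_usage.values():
        for date_str, count in rows:
            totals[date_str] += count
    # Sort chronologically so the result charts/exports cleanly
    return dict(sorted(totals.items()))

usage = {
    "abc123": [("2020-08-01", 10), ("2020-08-02", 4)],
    "def456": [("2020-08-01", 3), ("2020-08-03", 7)],
}
print(total_usage(usage))  # -> {'2020-08-01': 13, '2020-08-02': 4, '2020-08-03': 7}
```

From there, writing the totals to CSV with the csv module would give the downloadable report this idea asks for.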
08-22-2020
05:59 AM
|
2
|
0
|
468
|
POST
|
Is there a way, either in Pro, ArcMap, AGO, or arcpy, that classifications can be automatically updated when the data changes? I have a workflow with daily data changes, and as this occurs the upper data value typically increases. That means the highest value is no longer symbolized because it is out of range. I know there are multiple ways to do this manually, but I want the classification to update when new data comes in, without having to change it by hand and republish the map service each time.
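The recompute-the-breaks half of this is straightforward to script. A minimal sketch, assuming equal-interval classification (equal_interval_breaks is a hypothetical helper, not an Esri function); pushing the resulting break values into the layer's renderer would be a separate arcpy.mp step before republishing:

```python
def equal_interval_breaks(values, n_classes):
    """Recompute equal-interval class breaks from the current data range,
    so the upper break always covers the new maximum value."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / n_classes
    # Upper bound of each class; the last break equals the data maximum
    return [lo + step * i for i in range(1, n_classes + 1)]

# Yesterday's data topped out at 10; today's new rows push the max to 16,
# so the recomputed breaks grow to cover it
print(equal_interval_breaks([0, 5, 10], 2))      # -> [5.0, 10.0]
print(equal_interval_breaks([0, 5, 10, 16], 2))  # -> [8.0, 16.0]
```

Run on a schedule after the daily load, this keeps the highest value inside the top class instead of falling out of range.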
07-14-2020
05:29 AM
|
1
|
0
|
532
|
POST
|
Maybe QField, if you're running a supported database. I think it has online and offline modes. You might then need to set up a Python script to get synced to Portal. The Leica software works OK for me; I Dropbox the data from the field. If you really wanted to, you could Dropbox it to a PC and have a Python script running that watches your Dropbox folder and, when there is new data, performs a sync task. That's kind of a long way to go, though, unless you're good with Python and enjoy tinkering with automated tasks. One caveat: the OS on the Zeno is so old it might not support modern apps, and even if it does, they likely won't be in the Amazon app store. There were more bad decisions made with the 20's design than a bachelor party in Vegas.
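For what it's worth, the watch-the-Dropbox-folder part is only a few lines of polling. A minimal sketch, where find_new_files and watch are made-up names and sync() stands in for whatever pushes the data to Portal (an arcpy append, the ArcGIS API, etc.):

```python
import os
import time

def find_new_files(folder, seen):
    """Return paths of files in `folder` not yet in `seen`, marking them seen."""
    new = []
    for name in sorted(os.listdir(folder)):
        if name not in seen:
            seen.add(name)
            new.append(os.path.join(folder, name))
    return new

def watch(folder, sync, poll_seconds=60):
    """Poll the local Dropbox sync dir forever; hand each new arrival to sync()."""
    seen = set(os.listdir(folder))  # ignore whatever is already there at startup
    while True:
        for path in find_new_files(folder, seen):
            sync(path)
        time.sleep(poll_seconds)
```

A Windows scheduled task that launches this at logon would make it reasonably "set it and forget it". Note that polling only notices a file's name, not whether Dropbox has finished writing it, so a real version might also wait for the file size to stabilize before syncing.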
07-02-2020
06:45 AM
|
0
|
0
|
1653
|
IDEA
|
I love this idea. Currently, when building an app I use a URL redirect from our own web server. That works for keeping links the same on our page but it doesn't help for folks who have visited the app and then bookmarked it.
06-30-2020
10:01 AM
|
0
|
0
|
4272
|
POST
|
Hi Graham, Having been down this road, the best solution for us was to use a GG04 smart antenna instead. The second-best route is to use the Zeno with the Leica software instead of Collector. It now has the ability to add AGO layers, though I'm not sure if those are directly editable; they might be. Even when the TLS proxy fix was configured, we had some performance issues trying to use Collector. There's an ArcGIS add-in available to push your feature classes' schema to Leica, if disconnected editing is an option. I'm a little disappointed that Leica is still selling these units: the 3G modem isn't really viable anymore and using Collector is pretty much a no-go. Hit up Jason Hooten from Leica; he can probably offer some advice and alternatives.
06-11-2020
05:25 AM
|
0
|
1
|
1653
|
POST
|
Thanks for that. Much appreciated. I'm not all that familiar with arcade yet so I'm thinking the best approach for me is to add a previous count field to the hosted feature class. Next, I'll start doing my edits in Pro rather than on AGO directly so all edits are sent simultaneously when I save them. Before I do my daily edits to update the current counts, I'll just do a field calc on the previous count field, pushing the current values over. Then I update the current case counts. The indicator should then (in theory) work because I can provide an attribute value as a reference. Takes a few more seconds to do it that way, but it should be foolproof, I hope.
05-29-2020
09:39 AM
|
1
|
0
|
488
|
POST
|
Thanks. That could work if I can conjure an expression that'll calc the difference between the most recent and second most recent rows. Maybe create sorted lists in Python and grab the first two values and subtract them? Does that sound about right?
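The sorted-list idea sketched in plain Python (an Arcade version would be similar using its sort and First functions); reports and latest_change are made-up names, and the counts are sample data:

```python
from datetime import date

# Reports arrive on irregular dates, so "yesterday" can't be assumed
reports = [
    (date(2020, 5, 20), 110),
    (date(2020, 5, 26), 131),
    (date(2020, 5, 23), 118),
]

def latest_change(rows):
    """Sort by report date, newest first, and return
    (latest count, previous count, difference)."""
    ordered = sorted(rows, key=lambda r: r[0], reverse=True)
    (_, latest), (_, previous) = ordered[0], ordered[1]
    return latest, previous, latest - previous

print(latest_change(reports))  # -> (131, 118, 13)
```

Sorting by the date field rather than relying on row order is the key bit, since edits don't necessarily land in chronological order.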
05-28-2020
11:58 AM
|
0
|
4
|
2487
|
POST
|
Has anyone successfully gotten the indicator previous value reference to work right? It keeps giving the current value as the previous value. In my dash, I have data that gets updated once daily. When the data is changed (edited right on AGO) the indicator value changes. Unfortunately, so does the reference and it reports the new value as the previous value. I saw a bug out there about this and fixing it is listed as "not in the product plan".
05-28-2020
11:32 AM
|
0
|
0
|
779
|
POST
|
I figured out there's a bug in the previous value reference within the indicator widget and fixing it is "Not in current product plan". The data that would be easiest to use is hosted on AGO so a scheduled task seems unlikely without getting into a lot of API stuff. Another dataset is hosted from an internal server that could maybe be query layered into the map service as a table. I was looking for more of a "set it and forget it" solution like the indicator would be if they'd fix it. Out of curiosity though, what kind of field calc would be needed? Would it be a one time thing or would it have to be recalculated every day?
05-28-2020
10:45 AM
|
0
|
6
|
2487
|
POST
|
The previous value setting didn't work for me after all. The data changed but the percent change remained zero.
05-28-2020
06:30 AM
|
0
|
8
|
2487
|
POST
|
Thanks, Xander. I think I can tease out the most recent report using max(date) but then I need to compare it to the second most recent report. Since I can't just say "yesterday" (reports don't happen every day), I'm stuck there. I imagine it must be possible to sort the dates somehow and choose the second most recent to use as a reference to compare to the most recent. I just can't seem to figure out how that would be accomplished in arcade. Maybe I have to do this calc on the backend with Python? I was hoping to avoid schema changes though, if I can. Update: Now I see what you mean with the indicator. It has built in field {} calcs that refer to the previous value. That in conjunction with conditionally formatting for increases and decreases will work fine. Thank you!
05-26-2020
01:23 PM
|
1
|
9
|
2487
|
Title | Kudos | Posted
---|---|---
 | 1 | 04-06-2018 10:32 AM
 | 1 | 01-03-2020 06:26 AM
 | 2 | 08-22-2020 05:59 AM
 | 1 | 07-14-2020 05:29 AM
 | 1 | 11-04-2019 10:14 AM
Online Status | Offline
Date Last Visited | 11-11-2020 02:24 AM