POST
Hi Mark, Not in an unexpected way. Pro caches the info from a successful login, so a user doesn't have to log in every time they start Pro. Once the authorization times out, however, they will have to log in again. That auth info is cached in the user's profile, so if you're not deleting it from the profile — or deleting that user's profile itself — after the user logs out of the shared machine, then it will still be there the next time they log in to use that machine.
Posted 01-06-2020 03:14 PM
POST
Another 3D Analyst tool you might consider for tasks like this is Minimum Bounding Volume. I find Minimum Bounding Volume very convenient for converting points to volumes in the context of archaeological excavations. I use it to turn the points that define the often uneven boundaries of levels (or contexts, or stratigraphic units, or whatever your preferred terminology is...) into closed multipatch features (i.e., 3D volume representations). I use ArcGIS Pro, though if you are a Desktop user, it also has a version of this tool, Minimum Bounding Volume - Desktop.

As you've already done, the first step is loading the data from your Excel spreadsheet into a point feature class. You did not provide the coordinate system for your data; however, since it looks like an arbitrary example with units of meters, picking a projected coordinate system, like UTM, should do for this example. I arbitrarily chose to work with your data in my local UTM zone. Otherwise, by default, ArcGIS may assume a coordinate system that treats your X and Y values as degrees longitude and latitude, and your Z values as meters. As you can probably imagine, it would then take a lot of vertical exaggeration for a few centimeters of elevation difference to show up in a 3D view of your Level, when it is being treated as one degree in areal size. (Not to mention that you might get some very unexpected volume estimates as well!)

(I notice that your data seems to go up in elevation from your Start of the Level to the End of the Level, for point pairs with different elevations, rather than down, as I would expect for excavation. Your data will still work; however, it places your Start points below your End points when you visualize it.)

The second step is to generate multipatch features from each Level's points. In your sample data you only have a single Level. Assuming your full data set contains more than one Level, you would want to use the Level column to indicate which points should be grouped together to form an individual Level's volume. Because you have duplicate points for the same Level, you will see a warning message about such points being ignored in creating the multipatch feature. (When you have more than one Level in your data, you will have points with duplicate coordinates; however, they will be associated with different Levels, so they won't interfere with each other; one Level's ending point is another's starting point, until you hit the bottom.)

When running the tool, checking the box for "Add geometry..." will add an attribute, MBV_Volume, which will contain the volume of your Level. This is not an auto-calculating attribute, however, so if you edit your multipatch feature, then you will want to re-calculate this field (e.g., after running Update Feature Z). Why would you edit your multipatch features? If you surveyed your "duplicate" points multiple times -- maybe it took more than a day to excavate a level -- then the inaccuracies of the survey method will likely produce slightly different coordinates each time. So you might edit the multipatch feature's vertices to average out the difference, or to ensure one Level's vertices snap to an adjacent Level's. Also, because of those duplicate points, you will get an area that may not agree with how you are thinking of the area of this feature. Keep in mind that the whole "square" area is defined by the points you provided; however, more than half of the area the tool identified also has zero thickness, and is, therefore, not part of the actual multipatch feature.

If you visualize the resulting points and multipatch features in a Local Scene in Pro, you can see the 3D shape of your Level. In practice, levels/contexts/stratigraphic units can end up with some fairly convoluted shapes, so I would strongly recommend taking the time to visualize and explore the resulting multipatch features before trusting their volumes (or areas). The "Concave Hull" method usually does the right thing; however, occasionally you may find that you need to manually drop some points that are duplicates or superfluous, which are confusing the algorithm relative to your expectations for the shape of the volume. Hope that helps!
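If you want a quick sanity check on the MBV_Volume numbers the tool reports, the simplest case conceptually is the axis-aligned bounding box (the tool's envelope-style geometry option, if I recall its choices correctly), which you can reproduce by hand. The helper below is a plain-Python illustration written for this reply, not part of any Esri API, and the sample coordinates are made up:

```python
def envelope_volume(points):
    """Volume of the axis-aligned bounding box enclosing (x, y, z) points."""
    xs, ys, zs = zip(*points)
    return (max(xs) - min(xs)) * (max(ys) - min(ys)) * (max(zs) - min(zs))

# A hypothetical 1 m x 1 m level excavated 0.25 m deep:
level = [(0, 0, 10.0), (1, 0, 10.0), (1, 1, 10.0), (0, 1, 10.0),
         (0, 0, 9.75), (1, 0, 9.75), (1, 1, 9.75), (0, 1, 9.75)]
print(envelope_volume(level))  # 0.25 (cubic meters)
```

This also makes the units point above concrete: if X and Y were being treated as degrees rather than meters, the same arithmetic would produce a meaningless "volume."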
Posted 12-14-2019 11:24 AM
POST
Below is an example of the silent install options we use in many of our computer labs. Since you are using Named User licensing, you can point to your organization's ArcGIS Online instance, rather than the generic www.arcgis.com, to potentially save users steps when logging in. This is especially handy if you have configured your instance to use Enterprise Logins, and made Enterprise Logins your only login option. Then your users are sent straight to your single-sign-on dialog when they start up Pro, rather than having to fill out Esri's generic login dialog and find their way to your organization.

msiexec.exe /i <path to installer>\ArcGISPro.msi \
  ALLUSERS=1 \
  SOFTWARE_CLASS=Professional \
  AUTHORIZATION_TYPE=NAMED_USER \
  License_URL="https://<your org>.maps.arcgis.com" \
  Portal_List="https://<your org>.maps.arcgis.com" \
  CHECKFORUPDATESATSTARTUP=0 \
  LOCK_AUTH_SETTINGS=TRUE

One important note: make sure you specify the parameters in the order presented above! (The installer for ArcGIS Pro 2.4.2, and maybe any 2.4.x, introduced a bug whereby it does not properly parse the command-line arguments, so re-arranging them can cause unexpected failures. For example, if you construct your command line in the order in which the parameters are explained in the documentation, then the installer will report success; however, the settings you specified will not be applied correctly, and you will get errors or unexpected behaviors when you try to log in to Pro.)
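Because of that ordering bug, when scripting deployments it can help to keep the parameters in a fixed sequence rather than assembling them ad hoc. Here is a small, hypothetical Python sketch (the function name and placeholder paths are mine, not Esri's) that builds the command with the parameters locked in the order shown above; it only constructs and prints the string, it does not run the installer:

```python
# Placeholder org URL; substitute your own.
ORG_URL = "https://<your org>.maps.arcgis.com"

# A list of pairs (not a dict), so the order can never be accidentally shuffled.
PARAMS = [
    ("ALLUSERS", "1"),
    ("SOFTWARE_CLASS", "Professional"),
    ("AUTHORIZATION_TYPE", "NAMED_USER"),
    ("License_URL", f'"{ORG_URL}"'),
    ("Portal_List", f'"{ORG_URL}"'),
    ("CHECKFORUPDATESATSTARTUP", "0"),
    ("LOCK_AUTH_SETTINGS", "TRUE"),
]

def build_install_command(msi_path):
    # Assemble msiexec arguments in the fixed, known-good order.
    args = " ".join(f"{k}={v}" for k, v in PARAMS)
    return f"msiexec.exe /i {msi_path} {args}"

print(build_install_command(r"<path to installer>\ArcGISPro.msi"))
```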
Posted 12-11-2019 01:24 PM
POST
Not sure if all this goofiness is still present in more recent versions of the API. The reason for it in this specific case was that the API was treating the values supplied this way as strings, rather than as numbers. While numerically 000001 equals 1, the strings "000001" and "1" are not equal. It also wanted values in microseconds, rather than milliseconds. Hence the additional factor of 1000 in the conversion. -peter
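For anyone hitting the same thing, both gotchas are easy to demonstrate in plain Python (this is an illustration written for this reply, not the API's actual code):

```python
# Gotcha 1: as strings, zero-padded values don't match; as numbers, they do.
assert "000001" != "1"
assert int("000001") == int("1")

# Gotcha 2: the API wanted microseconds, so a millisecond timestamp
# needs the extra factor of 1000 mentioned above.
def ms_to_us(millis):
    return millis * 1000

print(ms_to_us(1_573_747_320_000))  # 1573747320000000
```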
Posted 11-14-2019 08:02 AM
BLOG
If you haven't already, I would recommend taking a look through the generic AWS guide, Changing the Instance Type. It sounds like you are interested only in adding more memory to your setup, and do not need to change anything else about it. In other words, you plan to stick with the same instance family, but want to use a different instance type within that family, one with more memory (e.g., maybe you want to migrate from an m4.xlarge with 16 GB to an m4.2xlarge with 32 GB). I would strongly recommend that you do make a snapshot first, in case something goes wrong along the way and you need to revert to your existing setup. The basic process for "adding memory" to an instance -- assuming you are using a typical EBS-backed instance -- is to stop your instance, change its instance type to a compatible one that has more memory, and then start the instance again. The steps are outlined in detail in the section of the AWS document titled Resizing an Amazon EBS-backed Instance. Hope that helps, -peter
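For reference, the same stop / change-type / start sequence maps onto standard AWS CLI calls. The sketch below just prints the commands for review rather than executing them; the instance ID and target type are placeholders, not real resources:

```python
def resize_plan(instance_id, new_type):
    # Emit the AWS CLI steps for resizing an EBS-backed instance,
    # in order: stop, wait until stopped, change type, start again.
    return [
        f"aws ec2 stop-instances --instance-ids {instance_id}",
        f"aws ec2 wait instance-stopped --instance-ids {instance_id}",
        f"aws ec2 modify-instance-attribute --instance-id {instance_id} "
        f"--instance-type {new_type}",
        f"aws ec2 start-instances --instance-ids {instance_id}",
    ]

for step in resize_plan("i-0123456789abcdef0", "m4.2xlarge"):
    print(step)
```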
Posted 09-10-2019 06:42 AM
POST
The current version of Docker Desktop for Windows, 2.1.0.2, only supports Windows 10. You need to obtain an older version, such as 2.0.0.3, which can still be installed under Windows Server 2016 or 2019. See the Docker Desktop for Windows Stable Release notes for links to download older versions.
Posted 09-04-2019 12:09 PM
POST
I am attempting to install Notebook Server with ArcGIS Enterprise 10.7.1. I've reached the part in Install Docker for ArcGIS Notebook Server where you install Docker Desktop on Windows. The Docker Desktop on Windows requirements, however, indicate that it is only compatible with Windows 10. Attempting to install it on Windows Server 2016 fails with the error, "Docker Desktop requires Windows 10 Pro or Enterprise version 15063 to run." Am I missing something obvious here? How does one complete the prerequisite step of installing Docker, in order to install Notebook Server, if one is running Windows Server 2016 (or 2019, for that matter)? Thanks! -peter
Posted 08-31-2019 01:16 PM
POST
I just verified that it still works correctly with our ArcGIS Online organization. Accounts created are enterprise accounts, not built-in arcgis accounts. We have been relying on this method for a while now, so I suspect the person with whom you communicated at Esri may be misinformed. For instance, our ArcGIS Online instance is configured to automatically join authorized enterprise users to our organization, so a user is able to do whatever they need the first time they log in. Sometimes, however, circumstances arise where we need to add an enterprise user to a specific group before they have logged in that first time. As the user has not logged in before, their enterprise account doesn't yet exist in the system, so we cannot add them to the group. Therefore, we use the above method to create enterprise accounts, so that we can add users to the groups, even if they haven't logged in themselves previously. As I mentioned, if you don't supply all the parameters it is expecting, then you get unexpected errors. In some cases this means an account is created, but it is a built-in account, rather than an enterprise account. Give it a try.
Posted 08-29-2019 02:15 PM
POST
If you would like to do this via the ArcGIS API for Python, then there is some basic info here: Creating new user accounts. Unfortunately that doc does not include a complete example for the specific case of creating enterprise logins; however, I've found code like the snippet below works. (If you leave out parameters, like password, which it shouldn't need, you get unexpected errors.)

# Assumes you have already connected as an administrator, e.g.:
# from arcgis.gis import GIS
# gis = GIS("https://<your org>.maps.arcgis.com", "admin_username")
new_enterprise_user = gis.users.create(
    username = 'xxxxxxxx_umich',
    password = 'None',  # placeholder string; enterprise logins don't use it,
                        # but omitting the parameter causes unexpected errors
    firstname = 'First',
    lastname = 'Last',
    email = 'xxxxxxxx@umich.edu',
    role = 'org_publisher',
    provider = 'enterprise',
    idp_username = 'xxxxxxxx',
    level = '2',
    user_type = 'creator'
)
Posted 08-29-2019 12:59 PM
BLOG
I like the idea of substituting a Story Map for the traditional paper field guide for a field trip! Now I'm wondering if there is an easy way to take a Story Map "offline", to deal with field trips where the points of interest are in locations with no cellular data coverage? Is there a simple way to host a Story Map on one's mobile device? We've been using offline maps in Collector and Explorer to hold our field guide content, which we split up and attach as PDFs, image files, etc., to relevant locations on the map. We also have a master PDF for overview materials, which can be downloaded and read offline in the student's favorite PDF app. A Story Map for that purpose instead, still with links that would open the Collector/Explorer project on the same device at the correct location, would be great!
Posted 05-28-2019 06:09 AM
POST
I found the documentation and videos in the QuickCapture GitHub repo very helpful for getting started. (You might miss the documentation links on first glance... there is a table of contents in the box on the right with "Home, 1. Introduction, 2. Basic configuration, ...")
Posted 02-01-2019 11:11 AM
POST
It is not well documented in any of the various ArcGIS APIs that support the time parameter for queries. Your hunch is correct, though: the time parameter works only with time-aware layers. Once you make your layer time-aware, using time parameters with query will filter results based on the field(s) you've specified in defining the time-awareness of the layer.
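To make the mechanics concrete, here is a small sketch (plain Python, written for this reply, not from any Esri SDK) of what the REST-level query parameters look like: once the layer is time-aware, the time parameter is a "start,end" pair of epoch milliseconds.

```python
from datetime import datetime, timezone

def time_query_params(start, end, where="1=1"):
    # Convert aware datetimes to the epoch-millisecond pair the
    # REST 'time' parameter expects.
    to_ms = lambda dt: int(dt.timestamp() * 1000)
    return {"f": "json", "where": where, "time": f"{to_ms(start)},{to_ms(end)}"}

params = time_query_params(
    datetime(2018, 1, 1, tzinfo=timezone.utc),
    datetime(2019, 1, 1, tzinfo=timezone.utc),
)
print(params["time"])  # 1514764800000,1546300800000
```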
Posted 01-02-2019 05:43 PM
POST
As noted, ArcGIS QuickCapture can be configured to have just a single button. That might be the simplest way to go here. The iPad itself is probably too big to have easily accessible under the conditions you describe; however, you could use QuickCapture in conjunction with an iPhone (or other small mobile device) strapped to the user's forearm. Then, as you're doing your task, you can easily reach over and tap the button to record your current location. (Note that the device's battery can drain fast with this approach, as you need to leave the device on so that the button is readily available to press. We carry an extra battery in this case, and charge the device during breaks to make it through the whole day; and we are using Collector or Survey123 directly, rather than QuickCapture.)

If you just want a "button" -- AND you can access WiFi where you are, rather than Bluetooth -- then another approach to consider is an Amazon IoT Button. (Maybe you have a cellular iPad that can serve as a hotspot? Maybe you are already carrying a separate WiFi hotspot to provide connectivity for your iPad?) When pressed, the button's Lambda function runs, utilizing the ArcGIS API for Python (you could also leverage the ArcGIS REST API) to record information in ArcGIS. Meanwhile, your iPad is running Collector with a Location Tracking layer. (Note that I do not believe Location Tracking layers are supported in the new Collector yet, so we have had to stick with Collector Classic for this specific purpose for now; you can run both side by side on your device.) So when the button is pressed, the code that runs in the cloud grabs the most recent record in the location tracking layer and appends it to another feature. Your code should also check that the button-press timestamp and the most recent tracking point are from about the same point in time, in case the GPS has lost signal and the location tracking layer isn't being updated at the moment.

You can leverage the notification capabilities of the AWS button service to text you that your location wasn't recorded properly; though there is a bit of a delay with notifications, so you may find yourself having to walk back to the previous location to record it again. (One of our main uses for this has been for "I'm here now!" purposes over the course of a day. Think of it as a tracking layer filtered to record your location only when you manually choose to have it do so, when you don't want to fuss with having to pull out your phone and do something directly in Collector. That sounds pretty similar to what you're looking to do? Recording the button-press timestamps makes it easy to join/filter with all sorts of data after the fact, too.)

The IoT button works best if your need for location accuracy isn't too high. The iPad GPS, as you're already aware, can be off by quite a bit, especially under canopy. The delay in recording the button press and the temporal resolution of the tracking layer can add to the error too, especially if you move a lot during the delay it takes to process. Depending on your use case, however, that error source can be greatly reduced if you click the button at the start, rather than the end, of the task, so that you are "in the area" while things are being recorded, rather than walking or driving away to somewhere else. Hope that helps!
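The timestamp-freshness check described above can be sketched in plain Python. Everything here (the function name, the field names, the 60-second threshold) is illustrative and written for this reply, not taken from Collector or any Esri API:

```python
def latest_point_if_fresh(track_points, press_ts_ms, max_skew_ms=60_000):
    """Return the most recent tracking point only if it is close enough
    in time to the button press to trust; otherwise None (GPS went stale)."""
    if not track_points:
        return None
    latest = max(track_points, key=lambda p: p["ts"])
    return latest if abs(press_ts_ms - latest["ts"]) <= max_skew_ms else None

# Made-up tracking records: epoch-ms timestamp plus lon/lat.
points = [{"ts": 1_545_990_000_000, "x": -83.74, "y": 42.28},
          {"ts": 1_545_990_045_000, "x": -83.73, "y": 42.28}]

print(latest_point_if_fresh(points, 1_545_990_050_000))  # the second point
print(latest_point_if_fresh(points, 1_545_990_200_000))  # None
```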
Posted 12-28-2018 10:21 AM
POST
Hi Kelly, I would like to echo Pat's comments. In particular, the lack of support for automatically assigning a default set of entitlements to a new enterprise login. In other words, the first time an enterprise user logs into a component of the ArcGIS platform, they would get immediate access to a predefined combination of ArcGIS Online, ArcGIS Pro, GeoPlanner, Business Analyst, Insights, etc. They would get the access when they need it, rather than facing a barrier in their way and generating an extra, per-user administrative burden on the institution. (This is for enterprise logins, not arcgis logins.)

Addressing that need (as it sounds like you might in the next release?) will also eliminate a number of related issues; issues which typically only occur as a result of people struggling to deal with that simple piece of missing functionality. Whether an institution implements scripting or manual workarounds for that missing piece, it generates a number of additional failure points and/or extra work. Not what one expects from software sold as an enterprise-class, scalable, Software as a Service (SaaS) system.

The introduction of User Types is perhaps a step toward solving the problem, but it doesn't solve it at the moment. I suspect the introduction of user types is setting up some longer-term changes? Otherwise, like Pat, I find it confusing, and I am failing to see the benefit at the moment... If you do not plan to implement a way to specify a default set of entitlements for enterprise logins in the next release, then perhaps the user type capability can be leveraged instead? Since you can specify a default user type for a new enterprise login, will you be adding the ability for administrators to define a custom user type?

That would lead to a relatively simple, yet powerful, workflow for administrators: they would create a custom user type, set the desired combination of entitlements for that user type (from all available in the instance, not just Pro), and then set that user type as the default for new enterprise logins. Then, when a new enterprise login hits the system for the first time -- and automated join is configured -- they can start working right away, without being stuck waiting on an automated script or manual intervention by an admin.

On the enterprise login configuration page you can specify that a new user can join automatically, along with default values for Role, Group(s), credit quota, Esri Access, etc., but not a default set of entitlements. To get around this oversight, some institutions, including mine, periodically run an automated Python script to provide the missing license auto-provisioning capability. Requiring an institution to perform this task with a script is, I believe, the single largest barrier to broader adoption of the ArcGIS platform in the educational realm. (I suspect it is, or soon will be, a barrier for any large customer, education sector or otherwise, that needs to automate things after enabling enterprise logins.)

Having to know scripting to have a scalable system that provides a basic level of functionality is not a typical expectation for SaaS. Furthermore, many people tasked with ArcGIS Online administration, in education or otherwise, do not possess the level of scripting expertise required. You cannot have thousands of users dependent on a script hacked together from examples in the ArcGIS API for Python. Such a script needs to be robust, secure, scalable, provide logging, handle exceptions, etc., if it is to be successful and reliable in an enterprise environment. Additionally, if scripting isn't an option for an institution, then the manual management requirements scale linearly with the number of users. The more users, the more administrative workload; again, not a typical expectation for a SaaS system.

If the issue with automated licensing/entitlements were addressed, then the requirement for scripting expertise and/or cumbersome manual management would be eliminated, or significantly reduced. This change would enable the many institutions with ArcGIS administrators who have neither the time for manual tasks nor the scripting expertise to empower all their members with access to ArcGIS. There will always be the occasional exception, of course. This functionality, however, is about setting the majority of the institutions implementing ArcGIS up for success, rather than failure, as they scale up access across their organizations.

Enabling enterprise logins, or single sign-on, is also a barrier for some institutions; however, it is one with a clear solution, once the right people at an institution are involved in the conversation. Simply enabling enterprise logins, without also being able to auto-provision entitlements, still leaves one with the significant barrier of manual effort or scripting to support broad adoption. Currently many institutions are trapped by the "more users, more time commitment" issue. That approach does not scale. Rather, it sets up an institution for failure; or, just as bad, forces them to implement a digital divide, creating an environment of haves and have-nots, where, even though they have an institution-wide license, some users get access while others are denied.

Thanks for reaching out here for input on this topic. Hope you find the above helpful as you plan the next release! Cheers, -peter
Posted 12-11-2018 05:56 AM