IDEA
I am trying to create a Python script that catalogs all items in our geodatabases and their configuration using arcpy.Describe. Currently there is no way to determine whether a feature class, table, or other item is branch versioned. The closest thing is the '.isVersioned' property for datasets (link), but that is just a boolean value, so if a dataset is versioned there is no way to tell whether it is configured for traditional versioning or branch versioning. This is becoming important for us as we transition many of our geodatabase items into an ArcGIS Enterprise web services model for editing. It would be helpful to scan all geodatabase items and return a table documenting which are not versioned, which are still traditional, and which are now branch, to assist with our transition. I think the best way to add this information to arcpy would be a new property named "versionType" on the dataset object (link). If the dataset is versioned, it would return "Branch" or "Traditional"; if the dataset isn't versioned, the property could return null or simply not be present on the object.
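In case a concrete example helps, below is a rough sketch of the inventory script I have in mind today, using arcpy.da.Walk and Describe. It can only report the boolean isVersioned; the comment marks where the proposed versionType would slot in. The connection file path and output CSV name are placeholders.

    # Sketch only: inventory datasets in a geodatabase and record whether each
    # reports isVersioned. Replace the connection path with your own .sde file.
    import csv
    import os
    import arcpy

    workspace = r"C:\connections\my_gdb.sde"  # placeholder connection file

    with open("versioning_inventory.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["path", "dataType", "isVersioned"])
        for dirpath, dirnames, filenames in arcpy.da.Walk(workspace):
            for name in filenames:
                item_path = os.path.join(dirpath, name)
                desc = arcpy.Describe(item_path)
                # A future desc.versionType ("Branch"/"Traditional") would go here.
                writer.writerow([item_path, desc.dataType, getattr(desc, "isVersioned", None)])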
Posted 09-20-2022 08:24 AM

IDEA
Please add Microsoft's Python libraries for interacting with Azure Blob Storage and Azure Data Lake Storage to the standard installed libraries. After processing some reports in Notebook Server about our geodatabases, Enterprise portals, and ArcGIS Online environments, I want to write the CSV results to a folder in an Azure Data Lake. However, those libraries aren't in the standard library list. With Notebook Server focused on data science and ML, I figure reading from and writing to Azure, AWS, and Google blob/data lake storage platforms would be a common use case. Thank you,
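For reference, this is the sort of upload step I would run after writing the results CSV, assuming the azure-storage-blob package (the v12 client, which also works against ADLS Gen2 accounts) were available in the notebook runtime. The account URL, credential, container, and file names are placeholders.

    # Sketch only: upload a finished report CSV to Azure Blob / Data Lake storage.
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient(
        account_url="https://<storageaccount>.blob.core.windows.net",
        credential="<sas-token-or-account-key>",
    )
    blob = service.get_blob_client(container="gis-reports", blob="gdb_report.csv")

    with open("gdb_report.csv", "rb") as data:
        blob.upload_blob(data, overwrite=True)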
Posted 08-21-2022 01:28 PM

IDEA
Please create some way to obtain the definition query applied to existing map/feature services that reference an enterprise geodatabase. From my experience, once the service is published from Pro, there is no way through the REST API to determine whether a layer in the service is applying a definition query against its source geodatabase feature class. On the admin side of the REST API, you can access the manifest JSON, but it only tells you the feature class names, not whether a definition query is applied. The only way I've been able to find this information is to inspect the manifest JSON, find the name of the on-server .msd file that was uploaded by Pro, log into the server and browse to that file, then open it and inspect the definition query on the layer.
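To illustrate the gap, here is roughly how I pull the manifest today with Python and requests. The URL pattern follows my reading of the Admin API's iteminfo/manifest resource, and the JSON keys, server URL, service path, and token are assumptions/placeholders; nothing returned here exposes the layer's definition query.

    # Sketch only: fetch a service's manifest from the ArcGIS Server Admin API
    # and list the source feature classes it reports.
    import requests

    admin_url = "https://<myArcGISServerUrl>:6443/arcgis/admin"   # placeholder
    service = "MyFolder/MyServiceName.MapServer"                  # placeholder
    token = "<admin token>"                                       # placeholder

    manifest_url = f"{admin_url}/services/{service}/iteminfo/manifest/manifest.json"
    manifest = requests.get(manifest_url, params={"f": "json", "token": token}).json()

    for database in manifest.get("databases", []):
        for dataset in database.get("datasets", []):
            print(dataset.get("onServerName"))   # feature class name only, no definition query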
Posted 07-28-2022 08:55 AM

IDEA
Add a column for the folder an item is in to the administrative item report. The folders items are stored in often help with organization, especially for accounts that manage many items on behalf of the organization. Having the folder name on the report next to each item would help admins filter the output CSV. I will create a separate idea for this to be implemented on this report's counterpart in ArcGIS Enterprise 10.9+.
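As a stopgap, this is roughly the listing I put together with the ArcGIS API for Python; the report I'm asking for would make it unnecessary. It assumes an admin sign-in via GIS("home") and that the user/item limits below fit your org size.

    # Sketch only: write an owner/folder/item CSV for the whole org.
    import csv
    from arcgis.gis import GIS

    gis = GIS("home")  # admin session in a notebook or Pro

    with open("items_by_folder.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["owner", "folder", "itemId", "title", "type"])
        for user in gis.users.search(query="", max_users=10000):
            for folder in [None] + user.folders:          # None = root folder
                folder_name = folder["title"] if folder else ""
                for item in user.items(folder=folder, max_items=800):
                    writer.writerow([user.username, folder_name, item.id, item.title, item.type])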
Posted 07-06-2022 06:02 PM

BLOG
@Anonymous User, I tried to click the link for the "Create Detailed AGOL Usage Report for Every Item" bullet point, but the article has been removed. I was wondering how this was partially implemented so I can go try it. I'm trying to get daily (or at least monthly) usage counts for the 20,000+ items in our org on a regular basis without hammering the APIs.
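For context, the brute-force approach I'm trying to avoid scaling up looks roughly like the sketch below: one Item.usage() call per item, which gets expensive with 20,000+ items. The search query and date range are just placeholders for a small test.

    # Sketch only: per-item usage counts via the ArcGIS API for Python.
    from arcgis.gis import GIS

    gis = GIS("home")
    items = gis.content.search(query="", max_items=100)   # small test batch

    for item in items:
        usage_df = item.usage(date_range="30D", as_df=True)   # daily request counts
        if usage_df is not None and not usage_df.empty:
            print(item.id, item.title, int(usage_df["Usage"].sum()))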
Posted 06-22-2022 08:55 AM

POST
Thanks Josh, I'm not getting those elements when I navigate to the JSON for my layer. I'm looking at this URL while logged in with an admin account: https://<myArcGISServerUrl>/server/rest/services/<MyServiceName>/MapServer/0?f=pjson. My Portal and ArcGIS Server are 10.9.1. Was your example from a hosted feature service by chance? I found this article that walks through how to navigate to this spot in the JSON, but it's for a hosted feature service. In my scenario I am trying to find the definition query for a layer in a service referencing an enterprise geodatabase. The service creation process would be something like: 1) a GIS analyst opens Pro and adds a feature class from the geodatabase to the map, 2) adds a definition query to the layer, and 3) publishes to Portal or Server as a referenced service with feature access. Now, six months later, someone asks me, the admin, "Why doesn't this feature layer have all the features from the original feature class?" -Andrew
Posted 06-17-2022 12:02 PM

POST
I'm trying to determine if a definition query is applied to a layer in a feature service referencing one of our enterprise geodatabases. Does anyone know how to grab the applied definition query via the REST or Python APIs? Here's where I've looked so far. The REST API response (documentation link) for a service layer doesn't contain this information. I am able to grab the service manifest via REST (documentation) and inspect the JSON. It contains the connection properties for the source geodatabase and the name of the feature class used by each service layer, but it doesn't contain any information about the definition query either. The manifest does show a server path for the location of the service's .msd file on the server's disk, so I'm guessing this definition query information only lives within the MSD file itself. My server admin was able to grab the .msd file for me off disk, and I verified the definition query is buried in there. I guess I could write a script to grab the file and parse it, but I figured this would be exposed somewhere in the server admin interface or APIs.
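If I do end up parsing the file, the sketch below is the direction I'd take, assuming the .msd is a zip package of XML documents (worth confirming against your own copy). The element-name matching is a guess, so it just prints anything that looks definition-related; the file path is a placeholder.

    # Sketch only: scan the XML documents inside an .msd for definition-query-like elements.
    import zipfile
    import xml.etree.ElementTree as ET

    msd_path = r"C:\temp\MyServiceName.msd"   # copied off the server by the admin

    with zipfile.ZipFile(msd_path) as msd:
        for member in msd.namelist():
            if not member.lower().endswith(".xml"):
                continue
            root = ET.fromstring(msd.read(member))
            for elem in root.iter():
                if "definition" in elem.tag.lower() and elem.text and elem.text.strip():
                    print(member, elem.tag, elem.text.strip())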
Posted 06-17-2022 10:05 AM

POST
Thank you both for your responses. I got confirmation from the product team that state zero is not required for the upgrade. After speaking with our team about the history, it seems there were some problems in the early years of our geodatabases, circa 2007-2012. Details are fuzzy, but at the time there were problems with the upgrade. One of the things support recommended was "let's try upgrading with things unversioned," and after that we kept doing it as a conservative practice. We usually used the opportunity to do large schema revisions anyway, so unversioning wasn't a bad thing. Still, it's good to know what's required and what's a nice-to-have. Thanks again!
Posted 06-02-2022 12:44 PM

POST
Does anyone out there follow a practice of compressing your enterprise geodatabases to state zero before upgrades? For example, when upgrading your geodatabase version from 10.6.1 to 10.7.1, or upgrading the Oracle version from 12c to 19c. We have done it in my organization on our Oracle geodatabases for as far back as we can remember. I think it was a requirement in really early versions of the geodatabase that supported versioning (8.3 or 9.0?), or maybe it was required to avoid a bug in early versions, but I can't find anything to back that up. Regardless, compressing to state zero prior to upgrade is not spelled out as a requirement in any of the documentation for recent versions. I'm wondering if we could eliminate this step, since it seems to be a cautionary one that just makes the geodatabase a little simpler before the upgrade. I'm curious whether others do upgrades with lots of transactional versions floating around, without any issues. Thank you, Andrew
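For anyone unfamiliar with the routine I'm describing, it boils down to something like the sketch below, run during a maintenance window with users disconnected: reconcile/post and delete the named versions, then compress, which should take the geodatabase to state 0 if nothing is still referencing older states. The connection file and DEFAULT version name are placeholders; check the ReconcileVersions parameters against your arcpy version.

    # Sketch only: reconcile/post, delete versions, and compress before an upgrade.
    import arcpy

    sde = r"C:\connections\gdb_admin.sde"   # geodatabase admin connection (placeholder)

    versions = [v for v in arcpy.ListVersions(sde) if v.upper() != "SDE.DEFAULT"]
    arcpy.management.ReconcileVersions(
        sde,
        "ALL_VERSIONS",
        "sde.DEFAULT",
        versions,
        with_post="POST",
        with_delete="DELETE_VERSION",
    )
    arcpy.management.Compress(sde)
    print(arcpy.GetMessages())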
Posted 05-27-2022 07:12 AM

IDEA
Add the ability to filter a hosted feature layer by a subquery, for both definition queries and "select by attribute" workflows. In my case, I want to build a hosted view on top of a hosted layer. The hosted view would limit the results to only the most recent record for each ID that exists in a column. This query requires a date column and the ID column, which is not unique. In a SQL database this query would take the following form. For the example, the table we are filtering is named myTable, the column with IDs we want to group by is called myGroupID, and the column with dates used to find the most recent record is named myDate:

    select * from myTable t1
    where t1.myDate = (select max(t2.myDate) from myTable t2 where t2.myGroupID = t1.myGroupID)

Another scenario is filtering the whole table to the row with the maximum value overall:

    select * from myTable t1
    where t1.myDate = (select max(t2.myDate) from myTable t2)

This came up a lot with COVID dashboard data. You have a table with one row per day containing various COVID numbers. In some places you want to filter that master dataset to the latest row when sorted by date; in other places you want to use all the data to see trends. The Operations Dashboard widgets provide the extra configuration steps to do this kind of sorting and filtering, but applying the same kind of filtering to the hosted layer itself via a hosted view is missing.

Now, I assume that on Esri's side this would require the underlying ArcGIS Online data platform to support some kind of SQL structure, but it wouldn't have to be an RDBMS like PostgreSQL or SQL Server. Azure Synapse, for example, allows relatively advanced SQL queries against data lakes, including subqueries and aggregation functions. I also assume that operations like this add more load on Esri's systems, since the filter needs to scan other rows and compare, but this is handled in an RDBMS with indexes. There is already the concept in ArcGIS Online of tagging one column on a layer to make it time aware, and the track-aware GeoAnalytics tools in Pro require you to select a "Track ID" column. If we could define these columns on the AGOL layer, they could be used to selectively build indexes on the backend to optimize these kinds of queries.
Posted 04-20-2022 08:22 AM

IDEA
The new credit usage report is great for showing what credits users burned over a month or week. However, the report deployed in the September 2020 AGOL release does not include credit usage for feature storage, file storage, and tile storage. These missing categories make up 98% of our org's credit usage, so the report doesn't help us much when looking for the accounts using the most of our credits. Also, I noticed that the date range at the top of the report is given as a Unix epoch integer. A date format that is both human-readable and easy to parse with common BI date functions, like ISO 8601, would be better. So rather than Aug 1, 2020 UTC appearing as 1596240000, it would appear as 2020-08-01T00:00:00Z.
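A quick check of that example in Python, showing the conversion from the epoch value the report displays to the ISO 8601 string I'd rather see:

    from datetime import datetime, timezone

    epoch_seconds = 1596240000  # value shown on the report for Aug 1, 2020 UTC
    iso = datetime.fromtimestamp(epoch_seconds, tz=timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    print(iso)  # 2020-08-01T00:00:00Z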
Posted 09-30-2020 01:43 PM

BLOG
I have two questions about the "Database Storage Details" report you get in the last step of your workflow. #1: Is it filtered by the dates selected at the top of the Dashboard page, or is it just giving me the storage size of the items as of the time I run the report? #2: Is it showing only the feature storage size, separate from attachments? For example, let's say I have a 5 GB layer made up of 1 GB of hosted feature data and 4 GB of attachments. Is it correct to assume that the item's main page would show a total size of 5 GB, but on this report the item would show just 1 GB? Thanks for any info you can provide, Andrew
Posted 08-24-2020 01:09 PM

IDEA
ArcGIS Pro provides a nice GUI for updating the Python libraries used by the software. For folks like me this makes it much easier to get started with the Python features, since I don't have to worry about learning conda. However, when I hear about a new version of the arcgis Python library being available and go to the Python Package Manager, it is not shown as an option to update. I am curious why this flagship library from Esri is not available for update through this window when so many smaller ancillary libraries are. I do know that the Python API help has a page explaining how to do the update via the conda terminal: Install and set up | ArcGIS for Developers. However, the instructions don't look like they've been updated since 2.2, and it is a little clunky to go through those steps when Pro already has a feature to make updates easier.
Posted 07-29-2020 06:50 AM

IDEA
Currently ArcGIS Online allows layers to be non-editable for the public, but if the item is shared to a group with update capabilities, the members of that group can add the layer to an AGOL map with full editing control and make edits. However, if those users add the same layer to ArcGIS Pro, they cannot edit it. I would like the group editing ability on the layer to be honored in Pro. I guess the current workaround is to make the hosted layer editable and share it with the group, then make a hosted view that is non-editable and share that with the public. It's just extra items for us to maintain.
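Scripted with the ArcGIS API for Python, the workaround I'm describing looks roughly like the sketch below; the item and group IDs, view name, and GIS("home") sign-in are placeholders/assumptions.

    # Sketch only: editable source layer for the group, read-only hosted view for the public.
    from arcgis.gis import GIS
    from arcgis.features import FeatureLayerCollection

    gis = GIS("home")
    source_item = gis.content.get("<hosted feature layer item id>")
    edit_group = gis.groups.get("<group with update capabilities id>")

    source_item.share(groups=[edit_group])                    # editors see the editable layer

    flc = FeatureLayerCollection.fromitem(source_item)
    view_item = flc.manager.create_view(name="my_layer_public_view")

    view_flc = FeatureLayerCollection.fromitem(view_item)
    view_flc.manager.update_definition({"capabilities": "Query"})   # make the view read-only
    view_item.share(everyone=True)                            # public sees the view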
Posted 05-14-2020 06:52 AM

POST
FYI to everyone: you can't change the view definition for the cases standalone table from the Visualization tab of the view. That tab only shows layers with geometry and hides layers that are plain tables. So you have to add the view to a map, click the table in the contents, and then click "Set definition".
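If you'd rather not go through a map at all, something like the sketch below may also work via the ArcGIS API for Python; the item ID, table index, and query are placeholders, and I'm assuming viewDefinitionQuery is the property the UI sets behind the scenes.

    # Sketch only: set a view definition on a hosted view's standalone table.
    from arcgis.gis import GIS
    from arcgis.features import FeatureLayerCollection

    gis = GIS("home")
    view_item = gis.content.get("<hosted view item id>")

    cases_table = FeatureLayerCollection.fromitem(view_item).tables[0]
    cases_table.manager.update_definition({"viewDefinitionQuery": "status = 'Active'"})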
Posted 03-20-2020 06:02 AM