IDEA
We have several .aprx files in our organization, and I'm trying to iterate over them and determine the data source for each layer. We have some query layers that are commonly used throughout our organization for viewing certain datasets in our enterprise geodatabases. Currently, when I use arcpy.Describe on these query layers, the dataset type returned is 'FeatureLayer', which isn't accurate. Through the Pro UI, this same layer shows a type of "Query Feature Layer" when I inspect its properties. The properties window also shows the actual database query used by the layer, a property that is unique to query layers. It would be great if query layers returned a unique dataType when described, so I know to process them a little differently. It would also be nice if the describe object included a "Query" property containing the text of the SQL query. That way I could parse the text and determine which table (or tables) are referenced by the map layer.
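For reference, once a "Query" property existed, the table names could be pulled out with plain string parsing — here's a rough sketch (a naive regex, no arcpy; the query text and table names are my own examples):

```python
import re

def referenced_tables(sql):
    """Return table names that follow FROM or JOIN in a query-layer SQL string.

    Naive regex sketch: handles simple SELECT ... FROM ... JOIN ... statements,
    but not parenthesized subqueries or quoted identifiers.
    """
    pattern = re.compile(r"\b(?:FROM|JOIN)\s+([\w.]+)", re.IGNORECASE)
    seen, tables = set(), []
    for name in pattern.findall(sql):
        # Preserve order but drop duplicates.
        if name.lower() not in seen:
            seen.add(name.lower())
            tables.append(name)
    return tables

sql = ("SELECT p.PARCEL_ID, z.ZONE_CODE FROM GISADMIN.PARCELS p "
       "JOIN GISADMIN.ZONING z ON p.ZONE_ID = z.ZONE_ID")
print(referenced_tables(sql))  # ['GISADMIN.PARCELS', 'GISADMIN.ZONING']
```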
10-06-2022 02:35 PM | 6 | 3 | 3868
IDEA
|
Add a new Arcade function to the library that allows code to call a database function; the value returned by the function would be populated into the field. This could work much like the existing NextSequenceValue function, but instead of providing a sequence name, the user would provide a database function name. For my organization, this would let us build attribute rules that call other Oracle databases (like our permitting system) via database links. For example, if a GIS user is entering a new subdivision record, they would enter the permit number in one column. This new function would let us pass that permit number to a database function, retrieve information about the permit from the other database via a database link, and then auto-populate that information into other columns on the GIS record.
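To illustrate, here's how I imagine the attribute rule expression could read. This is purely hypothetical — DatabaseFunctionValue and its signature are invented; NextSequenceValue is the existing Arcade function I'm modeling it on:

```
// Hypothetical Arcade attribute rule (DatabaseFunctionValue is invented)
var permit = $feature.PERMIT_NUMBER;

// Proposed: call an Oracle function (reachable via a database link)
// and return its value into the field this rule is attached to.
return DatabaseFunctionValue("PERMITS.GET_OWNER_NAME", [permit]);
```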
10-06-2022 05:57 AM | 1 | 3 | 1918
IDEA
|
I noticed that copy and paste does not bring over the precision and scale if they are present. So I'll open a follow-up enhancement request along the lines of "copy and paste all columns in the fields view regardless of what is displayed."
09-21-2022 11:16 AM | 0 | 0 | 6849
IDEA
|
I am trying to create a Python script that catalogs all items in our geodatabases and their configuration using arcpy.Describe. Currently there is no way to determine whether a feature class, table, or other item is branch versioned. The closest thing is the '.isVersioned' property for datasets (link). However, that is just a boolean value, so if a dataset is versioned there is no way to tell whether it's configured for traditional versioning or branch versioning. This is becoming important for us as we transition many of our geodatabase items to an ArcGIS Enterprise web-services model for editing. It would be helpful to scan all geodatabase items and return a table documenting which are not versioned, which are still traditional, and which are now branch, to assist with our transition. The cleanest way to add this to arcpy is probably a new property named "versionType" on the dataset describe object (link). If the dataset is versioned it would return "Branch" or "Traditional"; if the dataset isn't versioned, the property could be null or simply absent from the object.
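To show what I'm after, here's a sketch of the catalog loop written against the proposed property. Note that versionType does not exist in arcpy today, and the FakeDesc stand-ins just simulate what arcpy.Describe might return:

```python
def classify(desc):
    """Bucket a describe-like object by versioning, using the *proposed*
    'versionType' property (which arcpy does not currently have)."""
    if not getattr(desc, "isVersioned", False):
        return "Not versioned"
    # Hypothetical property: would return "Branch" or "Traditional".
    return getattr(desc, "versionType", "Unknown")

class FakeDesc:
    """Stand-in for an arcpy.Describe result, for illustration only."""
    def __init__(self, name, is_versioned, version_type=None):
        self.name = name
        self.isVersioned = is_versioned
        if version_type is not None:
            self.versionType = version_type

datasets = [
    FakeDesc("Parcels", True, "Traditional"),
    FakeDesc("Hydrants", True, "Branch"),
    FakeDesc("Contours", False),
]
for d in datasets:
    print(d.name, classify(d))
```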
09-20-2022 08:24 AM | 21 | 7 | 5200
IDEA
|
Please add Microsoft's Python libraries for interacting with Azure Blob Storage and Azure Data Lake Storage to the standard installed libraries. After processing some reports in Notebook Server about our geodatabases, Enterprise portals, and ArcGIS Online environments, I want to write the CSV results to a folder in an Azure Data Lake. However, those libraries aren't in the standard library list. With Notebook Server focused on data science and ML, I figure being able to read and write from the Azure, AWS, and Google blob/data-lake storage platforms would be a common use case. Thank you.
08-21-2022 01:28 PM | 0 | 1 | 1655
IDEA
|
Please create some way to obtain the definition query applied to existing map/feature services that reference an enterprise geodatabase. From my experience, once the service is published from Pro, there is no way through the REST API to determine whether a layer in the service applies a definition query against its source geodatabase feature class. On the admin side of the REST API you can access the manifest JSON, but it only tells you the feature class names, not whether a definition query is applied. The only way I've found to get this information is to inspect the manifest JSON, find the name of the on-server .msd file that was uploaded by Pro, log into the server and browse to that file, then open it and inspect the definition query on the layer.
07-28-2022 08:55 AM | 4 | 0 | 1806
IDEA
|
Add a column to the administrative item report for the folder each item is in. The folder an item is stored in often helps with organization, especially for accounts that manage many items on behalf of the organization. Having the folder names on the report next to each item would help admins filter the output CSV data. I will create a separate idea for implementing this on the report's counterpart in ArcGIS Enterprise 10.9+.
07-06-2022 06:02 PM | 0 | 0 | 440
BLOG
|
@Anonymous User, I tried to click the link for the "Create Detailed AGOL Usage Report for Every Item" bullet point, but the article was removed. I was wondering how this was partially implemented so I can go try it. I'm trying to get daily (or at least monthly) counts for 20,000+ items in our org on a regular basis, without blowing up the APIs.
06-22-2022 08:55 AM | 0 | 0 | 913
POST
|
Thanks Josh, I'm not getting those elements when I navigate to the JSON for my layer. I'm looking at this URL logged in with an admin account: https://<myArcGISServerUrl>/server/rest/services/<MyServiceName>/MapServer/0?f=pjson. My Portal and ArcGIS Server are 10.9.1. Was your example from a hosted feature service, by chance? I found this article that walks through how to navigate to this spot in the JSON, but it's for a hosted feature service. In my scenario I am trying to find the definition query for a layer in a service referencing an enterprise geodatabase. The service creation process would be something like: 1) a GIS analyst opens Pro and adds a feature class from the geodatabase to the map, 2) adds a definition query to the layer, 3) publishes to Portal or Server as referenced data with feature access. Now, six months later, someone asks me, the admin, "why doesn't this feature layer have all the features in it from the original feature class?" -Andrew
06-17-2022 12:02 PM | 0 | 0 | 5057
POST
|
I'm trying to determine whether a definition query is applied to a layer in a feature service referencing one of our enterprise geodatabases. Does anyone know how to grab the applied definition query via the REST or Python APIs? Here's where I've looked so far. The REST API response (documentation link) for a service layer doesn't contain this information. I am able to grab the service manifest via REST (documentation) and inspect the JSON. It contains the connection properties for the source geodatabase and the name of the feature class used by each service layer, but it doesn't contain any information about the definition query either. The manifest does show a server path for the location of the service's .msd file on the server's disk, so I'm guessing the definition query information only lives within the MSD file itself. My server admin was able to grab the .msd file off disk for me, and I verified the definition query is buried in there. I could write a script to grab the file and parse it, but I figured this would be available somewhere in the server admin interface or APIs.
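If anyone else ends up scraping the .msd: in my experience it's a zip archive of XML documents, so something like this rough sketch can pull candidate definition-expression strings. The element name in the regex is a guess from inspecting one file, so verify it against your own .msd (the demo archive at the bottom is fabricated for illustration):

```python
import re
import zipfile

def find_definition_queries(msd_path):
    """Search every XML member of an .msd archive for definition-expression
    text. Element name is a guess; adjust the pattern for your version."""
    pattern = re.compile(rb"<DefinitionExpression>(.*?)</DefinitionExpression>",
                         re.DOTALL)
    hits = []
    with zipfile.ZipFile(msd_path) as z:
        for member in z.namelist():
            if member.lower().endswith(".xml"):
                for match in pattern.findall(z.read(member)):
                    hits.append(match.decode("utf-8").strip())
    return hits

# Demo against a fabricated archive (a real .msd has many more members).
with zipfile.ZipFile("demo.msd", "w") as z:
    z.writestr(
        "layers/layer0.xml",
        "<Layer><DefinitionExpression>STATUS = 'ACTIVE'"
        "</DefinitionExpression></Layer>",
    )
print(find_definition_queries("demo.msd"))  # ["STATUS = 'ACTIVE'"]
```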
06-17-2022 10:05 AM | 1 | 5 | 5116
POST
|
Thank you both for your responses. I got confirmation from the product team that state zero is not required for the upgrade. After speaking to our team about the history, it seems there were some problems in the early years of our geodatabases, circa 2007-2012. Details are fuzzy, but at the time there were problems with upgrades, and one of the things support recommended was "let's try upgrading with things unversioned," so after that we kept doing it as a conservative practice. We usually used the opportunity to do large schema revisions anyway, so unversioning wasn't a bad thing. Still, it's good to know what's required versus what's just nice to have. Thanks again!
06-02-2022 12:44 PM | 1 | 0 | 2077
POST
|
Does anyone out there follow a practice of compressing your enterprise geodatabases to state zero before upgrades? For example, when upgrading the geodatabase version from 10.6.1 to 10.7.1, or upgrading Oracle from 12c to 19c. We have done it on our Oracle geodatabases as far back as anyone can remember. I think it was a requirement in very early versions of the geodatabase that supported versioning (8.3 or 9.0?), or maybe it was required to avoid a bug in early releases, but I can't find anything to back that up. Regardless, compressing to state zero prior to upgrade is not spelled out as a requirement in the documentation for recent versions. It seems to just be a cautionary step to make the geodatabase a little simpler before the upgrade, so I'm wondering if we could start eliminating it. I'm curious whether others do upgrades with lots of transactional versions floating around without any issues. Thank you, Andrew
05-27-2022 07:12 AM | 0 | 3 | 2145
IDEA
|
Add the ability to filter a hosted feature layer by a subquery, for both definition queries and "select by attribute" workflows. In my case, I want to build a hosted view on top of a hosted layer that limits the results to only the most recent record for each ID in a column. The query requires a date column and the (non-unique) ID column. In a SQL database, with the filtered table named myTable, the grouping column named myGroupID, and the date column used to find the most recent record named myDate, the query would take the following syntax:

select * from myTable t1
where t1.myDate = (select max(t2.myDate) from myTable t2 where t2.myGroupID = t1.myGroupID)

Another scenario is simply filtering the whole table to the row with the overall maximum value:

select * from myTable t1
where t1.myDate = (select max(t2.myDate) from myTable t2)

This came up a lot with Covid dashboard data. You have a table with one row per day holding various Covid numbers. In some places you want to filter that master dataset to the latest row by date; in other places you want all the data to see trends. The Operations Dashboard widgets provide extra configuration steps for this kind of sorting and filtering, but applying the same filtering to the hosted layer itself via a hosted view is missing. Now, I assume that on Esri's side this would require the underlying ArcGIS Online data platform to support some kind of SQL structure, but it wouldn't have to be an RDBMS like Postgres or SQL Server; Azure Synapse, for example, allows relatively advanced SQL queries against data lakes, including subqueries and aggregation functions. I also assume that operations like this add more load on Esri's systems, since the filter needs to scan and compare other rows, but that is handled in an RDBMS with indexes.
There is already the concept in ArcGIS Online of tagging one column to make a layer time-aware, and the track-aware GeoAnalytics tools in Pro require you to select a "Track ID" column. If we could define these columns on the AGOL layer, they could be used to selectively build indexes on the backend to optimize these kinds of queries.
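The "most recent record per group" filter is plain correlated-subquery SQL; here's a quick sqlite3 demo of the exact query I'd want a hosted view to support (the table and column names are my examples from above, with made-up rows):

```python
import sqlite3

# In-memory demo of the "most recent record per myGroupID" filter.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE myTable (myGroupID TEXT, myDate TEXT, value INT)")
con.executemany(
    "INSERT INTO myTable VALUES (?, ?, ?)",
    [("A", "2022-01-01", 1), ("A", "2022-03-01", 2),
     ("B", "2022-02-01", 3), ("B", "2022-01-15", 4)],
)
rows = con.execute("""
    SELECT t1.myGroupID, t1.myDate, t1.value
    FROM myTable t1
    WHERE t1.myDate = (SELECT MAX(t2.myDate)
                       FROM myTable t2
                       WHERE t2.myGroupID = t1.myGroupID)
    ORDER BY t1.myGroupID
""").fetchall()
print(rows)  # [('A', '2022-03-01', 2), ('B', '2022-02-01', 3)]
```

ISO-formatted date strings sort lexicographically, which is why MAX works on the TEXT column here; a real implementation would use a proper date type.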
04-20-2022 08:22 AM | 15 | 0 | 1306
IDEA
|
The new credit usage report is great for showing which users burned credits over a month or week. However, the report deployed in the Sep 2020 AGOL release does not include credit usage for feature storage, file storage, or tile storage. These missing categories make up 98% of our org's credit usage, so the report doesn't help us much in finding the accounts using the most of our credits. Also, I noticed that the date range at the top of the report is given as a Unix epoch integer. A date format that is both human readable and easy to parse with common BI date functions, like ISO 8601, would be better: rather than Aug 1, 2020 UTC appearing as 1596240000, it would be 2020-08-01T00:00:00Z.
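The conversion itself is trivial, which is what makes the raw integer so frustrating to see in a report — e.g. in Python:

```python
from datetime import datetime, timezone

# The report's date-range value: a Unix epoch integer (seconds, UTC).
epoch = 1596240000
iso = (datetime.fromtimestamp(epoch, tz=timezone.utc)
       .isoformat()
       .replace("+00:00", "Z"))
print(iso)  # 2020-08-01T00:00:00Z
```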
09-30-2020 01:43 PM | 10 | 5 | 2570
BLOG
|
I have two questions about the "Database Storage Details" report you get in the last step of your workflow. #1: Is it filtered by the dates selected at the top of the dashboard page, or is it just giving the storage size of the items as of the time I run the report? #2: Does it show the actual feature storage size separately from attachments? For example, say I have a 5 GB layer made up of 1 GB of hosted feature layer and 4 GB of attachments. Is it correct to assume that the item's main page would show the total size as 5 GB, but on this report the item would show just 1 GB? Thanks for any info you can provide, Andrew
08-24-2020 01:09 PM | 0 | 0 | 1153
| Title | Kudos | Posted |
|---|---|---|
| | 7 | 11-19-2025 07:03 AM |
| | 1 | 06-16-2025 02:17 PM |
| | 17 | 10-11-2024 12:58 PM |
| | 5 | 09-20-2024 09:19 AM |
| | 4 | 07-28-2022 08:55 AM |