POST
Thank you Mike. That queryDataElements REST endpoint does indeed return details about attribute rules for the underlying data. I can see the attribute rules being returned from some of my test services, where the rules also appear in Pro when inspecting the service through the Data Design menu. However, I also ran it against some services where Pro is not showing the attribute rules. For those, the queryDataElements endpoint returns this error:

```json
{
  "error": {
    "code": 400,
    "message": "Unable to complete operation.",
    "details": ["Parser error: Some parameters could not be recognized."]
  }
}
```

So I assume Pro is getting the same message but not displaying it to the user. I'm still going to do a few more tests, but it seems that including a query layer among the otherwise typical feature classes and geodatabase-registered standalone tables in the service is what causes this error. Referencing a database view (either registered with the geodatabase or not) is probably going to be the workaround. I'll try to post the final results when we close the ticket.
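In case it helps anyone reproduce this, here's a minimal sketch of how I've been calling the endpoint with Python. The service URL and layer ids are placeholders, and the response keys I'm reading (layerDataElements, dataElement, attributeRules) are my best reading of the responses I got back, not a documented contract:

```python
import requests

# Hypothetical feature service -- replace with your own (add a token if secured).
SERVICE = "https://gis.example.com/server/rest/services/MyService/FeatureServer"

resp = requests.get(
    f"{SERVICE}/queryDataElements",
    params={"layers": "[0, 1]", "f": "json"},  # ids of the layers/tables to describe
    timeout=30,
)
data = resp.json()

if "error" in data:
    # Services containing a query layer returned the 400 "Parser error" here.
    print("Error:", data["error"])
else:
    for el in data.get("layerDataElements", []):
        de = el.get("dataElement", {})
        rules = de.get("attributeRules", [])
        print(de.get("name"), "->", len(rules), "attribute rule(s)")
```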
Posted 07-30-2025 07:08 AM

POST
Thanks Mike. I'm in the middle of troubleshooting with support, as the issue seems intermittent. I'm going to be rebuilding things today and verifying as I go. I think the rules are eventually firing after a row is saved or committed in the standalone table in my Pro project.

I have noticed another related bug with the service. I have a geodatabase-referenced feature service that I've been publishing and republishing as I build out my schema for editing: writing attribute rule code, then publishing, testing, and republishing as needed. Once you publish the service and load it into a map's contents in Pro, you can right-click a layer from the service, open the "Data Design" menu, and choose "Attribute Rules". Sometimes this shows the attribute rules on the feature class/table referenced by that layer, but sometimes it is blank, even though I know there is an attribute rule on the table. This is what led me to think something is being dropped when the service is published.

The reason I was thinking the rules get published into the service is the "Exclude from Application Evaluation" feature. Hussein explains it pretty well in the blog article linked below, which I just found this morning. For a client application to fire attribute rules client-side, it needs to know what they are, so I assumed they are defined somewhere in the service definition REST. Either that, or the service makes a special call to the geodatabase to retrieve the rules and send them to the user's Pro client at some point before edits are made.

Attribute Rules - Exclude from Application Evaluat... - Esri Community
Posted 07-28-2025 07:04 AM

POST
I am trying to troubleshoot an issue with editing data in an enterprise geodatabase through a feature service in ArcGIS Enterprise 11.3. Does anyone know whether information about attribute rules on feature classes is written into a feature service when you publish a geodatabase-referenced service to Enterprise? And if so, where can I inspect that attribute rule definition in the service's JSON?

I'm asking because I've run into an issue where, if I edit a feature class with a calculation attribute rule applied, the rule works when I edit the feature class directly from Pro. But if I make the same edit through the service referencing the feature class, the attribute rule doesn't fire. This made me think the service is somehow not aware of the attribute rule, so I wanted to verify.
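For anyone who wants to reproduce the test outside Pro, this is roughly how I've been checking whether the rule fires through the service: apply an edit via the REST applyEdits operation, then read the row back and look at the field the rule is supposed to calculate. The layer URL and field names here are hypothetical:

```python
import json
import requests

# Hypothetical layer -- replace with your service and layer id (plus a token if secured).
LAYER = "https://gis.example.com/server/rest/services/MyService/FeatureServer/0"

# Update one attribute through the service; if a calculation rule on another
# field should fire on UPDATE, reading the row back shows whether it did.
update = [{"attributes": {"objectid": 1, "status": "NEW"}}]
resp = requests.post(
    f"{LAYER}/applyEdits",
    data={"updates": json.dumps(update), "f": "json"},
    timeout=30,
)
print(resp.json())

# Read the row back and inspect the rule-calculated field.
q = requests.get(
    f"{LAYER}/query",
    params={"where": "objectid = 1", "outFields": "*", "f": "json"},
    timeout=30,
)
print(q.json()["features"][0]["attributes"])
```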
Posted 07-22-2025 10:54 AM

IDEA
In ArcMap, you could edit data in tables or feature classes that were not registered with an enterprise geodatabase. That made it convenient to make small edits to ancillary tables in the database when needed, rather than having to open a separate tool like SSMS, Oracle SQL Developer, or pgAdmin. This capability was never added to Pro. Instead, when you try to edit a non-registered standard table, all the editing features are greyed out and you get a warning: "This enterprise database table is not registered with the geodatabase. Edits cannot be made."

I would like to see this ability added to Pro so it works like it did in ArcMap. I know it has its limitations (like having to start an edit session to make edits, clicking Save to commit, and the lack of undo), but sometimes you just want to use Pro for this kind of work instead of opening separate applications. I'll also say that registering all these other tables with the geodatabase isn't always an option. For example, in SQL Server you can't register a table that uses an identity column as its primary key, and identity columns are much less of a pain than managing sequences.

To clarify, I think this would apply to tables or feature classes in a relational database that has been configured as an Esri enterprise geodatabase but simply aren't registered, for specific business needs. It could also apply to tables or feature classes in a relational database that is not configured as an enterprise geodatabase at all, such as a SQL Server or Postgres database with some tables containing geometry columns.
Posted 06-16-2025 02:17 PM

POST
I just published my first hosted imagery layer to ArcGIS Online. It is a "tiled imagery layer" composed of just one raster. I wanted to check how much storage it is taking up in AGOL (and thus how many credits it uses for storage daily), but the size section of the item page says 0 KB. I even tried running the item report as an AGOL admin, and the item's file storage and feature storage values are both zero. The source raster I uploaded is a 20 MB TIF with LZW compression, so I'd imagine its AGOL counterpart would be at least that size or larger due to the tiling. Has anyone had experience finding the size information for these imagery item types? Thank you for any information you can provide, -Andrew
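For reference, here's how I've also been checking from the REST side: the item resource exposes a size property in bytes, which presumably mirrors the 0 KB shown on the item page. The org URL, item id, and token below are placeholders:

```python
import requests

ORG = "https://www.arcgis.com"   # or your org's URL
ITEM_ID = "<itemid>"             # the tiled imagery layer's item id
TOKEN = "<token>"                # needed if the item isn't public

resp = requests.get(
    f"{ORG}/sharing/rest/content/items/{ITEM_ID}",
    params={"f": "json", "token": TOKEN},
    timeout=30,
)
item = resp.json()
# "size" is reported in bytes; for this imagery item it also comes back empty/zero.
print(item.get("title"), item.get("type"), item.get("size"))
```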
Posted 03-19-2025 11:47 AM

IDEA
Please add the ability for ArcGIS Online layers to be displayed as Hub content using their default symbology. Currently, every hosted feature layer displayed in a Hub site's content is given a default blue symbology. Many GIS staff create default symbology for ArcGIS Online layers, especially for data that relies on some categorization to make sense. Of course someone can change that default after they add the layer to a map or download it, but often there is a common symbology for a layer that staff in an organization like to use as a "one size fits most" when adding data to applications. It would be helpful to have these datasets displayed to the public in the same way, rather than the symbology reverting to generic colors. For some datasets, this simple color scheme makes it hard to communicate the value of the data to the public when they first glance at it from a Hub content explore page.
Posted 10-11-2024 12:58 PM

IDEA
Please add an option in Real-Time Analytics that allows an input data source from a hosted feature layer to auto-refresh on a schedule.

In my case, I have a feed of data from a data source, and I need to join on information from a hosted feature layer. The hosted feature layer gets updated once every night by an ETL operation. This community article (Link) confirms that a real-time analytic only reads the source input when the analytic is started, meaning the data could be days, weeks, or months old if the analytic is stable and never needs to be restarted. I'd prefer not to have to log in and manually restart the analytic daily just to update the join table info.

It would be great if there were an extra option on the Source -> Feature Layer configuration window that let the user define a refresh interval in hours or days. Going one step further, I think some feature layers have a property with the timestamp of when the data was last refreshed, so it might be possible for the analytic to ping that timestamp on an interval and only re-ingest the source when it determines the data has changed. Thank you
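To illustrate that last idea: the property I have in mind is the layer's editingInfo.lastEditDate, exposed on the feature layer's REST resource. A minimal sketch of the check, assuming a scheduler persists the previous value between runs (the layer URL is a placeholder):

```python
import requests

# Hypothetical hosted feature layer used as the join source.
LAYER = "https://services.arcgis.com/<orgid>/arcgis/rest/services/JoinTable/FeatureServer/0"

def last_edit_ms(layer_url):
    """Return the layer's last-edit timestamp (epoch milliseconds), if exposed."""
    info = requests.get(layer_url, params={"f": "json"}, timeout=30).json()
    return info.get("editingInfo", {}).get("lastEditDate")

# A real scheduler would store the previous value between runs; shown inline here.
previous = 0
current = last_edit_ms(LAYER)
if current and current > previous:
    # Source data changed since the analytic started -> re-ingest / restart here.
    print("Join table refreshed at", current)
```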
Posted 09-20-2024 09:19 AM

POST
Hello, I'm relatively new to attribute rules, and I'm currently exploring batch calculation as an alternative to immediate calculation to speed up some bulk processing after large nightly data imports. I'm trying to create a batch calculation attribute rule that only needs to run on a subset of features in a feature class. Is there a way to apply a filter to the attribute rule so it only runs on features that meet the conditions of a SQL WHERE clause? My hope is that this will speed up the batch calculation run time.

For example, I'd like a batch calculation attribute rule that does something like this: for batch calc attribute rule A, only run it against rows in feature class B that match SQL WHERE clause C. For records that match the WHERE clause, update one attribute value on the row; all other rows in the feature class are ignored.

I guess the simple way to do this is to put an IIf() statement inside the attribute rule, but that means the rule still evaluates every row in the feature class. I already know this one rule would only apply to features with specific values, which would drop the number of rows to assess from 100k+ to only 100 or so. I gather there is an Arcade Filter() function, but I haven't seen an example of using it inside an attribute rule.
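Here's a minimal sketch of the IIf() fallback I described, written as the arcpy call that would create the rule. The feature class, field names, and status values are hypothetical, and batch rules have their own prerequisites (GlobalIDs, and in enterprise geodatabases branch versioning, if I recall) not covered here. Non-matching rows keep their current value, so the rule is a no-op there, but it is still evaluated against every flagged row:

```python
import arcpy

# Hypothetical feature class with a STATUS field and a RESULT_FIELD to compute.
fc = r"C:\connections\gis.sde\DB.OWNER.FeatureClassB"

# Arcade: only rows with STATUS = 'NEW' get a computed value; everything else
# returns its existing value unchanged.
arcade = """
IIf($feature.STATUS == 'NEW',
    'recalculated',           // value for matching rows
    $feature.RESULT_FIELD)    // leave every other row as-is
"""

arcpy.management.AddAttributeRule(
    in_table=fc,
    name="Batch calc subset",
    type="CALCULATION",
    script_expression=arcade,
    field="RESULT_FIELD",
    triggering_events="INSERT;UPDATE",
    batch="BATCH",
)

# Batch rules are then run on demand, e.g. after the nightly import:
# arcpy.management.EvaluateRules(r"C:\connections\gis.sde", "CALCULATION_RULES")
```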
Posted 01-25-2024 08:55 AM

POST
Hello, I'm relatively new to attribute rules. I'm trying to build them into a new schema I'm developing, to handle primary key population from a sequence using the NextSequenceValue() function. I've noticed that just by adding this one simple attribute rule, the time to bulk import features into the feature class slows significantly.

For my test I used a sample line dataset of 1,000 rows. When importing data through the Append tool with the rule disabled, the data loads in seconds. With the sequence attribute rule enabled, it takes about 1.5 minutes, and this is from Pro on a machine in the same datacenter as the Oracle geodatabase I'm loading into. The same thing happens when loading the data via FME.

1.5 minutes is okay, but that's just 1,000 rows. With the schema I'm building, one major workflow is loading data from a third party, and some of the 1:M tables related to the line feature class could get thousands of rows each every day. I'm worried the load time will be so long with this attribute rule that I'll hog our ETL scripting server all night with just this one job.

I'm wondering if I should fall back to alternative ways of generating primary keys outside the Esri ecosystem: for example, creating a trigger at the database level to grab sequence values on row inserts, or running a query in the ETL tool to grab sequence values from the database upstream of the geodatabase import. These feature classes won't be versioned or archive enabled, so I don't think trigger population would be an issue.

Although attribute rules are great for managing key attributes during transactional edits made by analysts throughout the day, they seem to really slow down bulk loads. I'm curious whether folks out there have mitigation strategies for speeding up nightly bulk loads while keeping attribute rules intact for daily edits.
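One mitigation I'm considering, sketched below: disable the calculation rule just for the bulk load, let the upstream process (ETL-assigned values or a database trigger) supply the keys during that window, then re-enable the rule for daily editing. Paths and the rule name are hypothetical:

```python
import arcpy

fc = r"C:\connections\gis.sde\DB.OWNER.Lines"       # target feature class
source = r"C:\staging\staging.gdb\Lines"            # nightly import data
RULE = "Populate primary key"                       # the NextSequenceValue() rule

# Turn the rule off for the bulk load; primary keys must come from somewhere
# else during this window (ETL query or a database-level trigger).
arcpy.management.DisableAttributeRules(fc, [RULE])
try:
    arcpy.management.Append(inputs=source, target=fc, schema_type="NO_TEST")
finally:
    # Re-enable so interactive edits by analysts keep getting sequence values.
    arcpy.management.EnableAttributeRules(fc, [RULE])
```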
Posted 01-23-2024 07:31 AM

IDEA
The enhancements implemented in arcpy.Describe for Pro 3.2 don't return a dateModified property for feature classes in my 10.9.1 Oracle geodatabase, nor for feature classes in my 10.9.1 Azure SQL Database. It does work for a file geodatabase feature class, though. So really, the enhancement is that arcpy.Describe will return size and dateModified if the underlying geodatabase supports it, and enterprise geodatabases (at least at 10.9.1) don't.

I assume this is because the SDE business tables don't contain any kind of auditing fields for date created or date modified on geodatabase items; I've hunted for something like that with no luck. I'd be curious whether the value would be returned by arcpy if the Oracle geodatabase were at version 11.2, but I assume those columns would have to be added somewhere in the SDE business schema and populated for arcpy to have data to grab. I could not find any documentation specific to what's new in enterprise geodatabase functionality at 11.2, so I can't verify whether this was done.
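For anyone wanting to reproduce the check, this is the kind of quick test I ran (paths are hypothetical; using getattr with a default avoids the AttributeError that Describe raises for unsupported properties):

```python
import arcpy

# First path hits a file geodatabase (works); second hits a 10.9.1 enterprise
# geodatabase (does not).
for fc in (r"C:\data\local.gdb\Parcels",
           r"C:\connections\oracle.sde\DB.OWNER.PARCELS"):
    d = arcpy.Describe(fc)
    print(fc,
          getattr(d, "dateModified", "not supported"),
          getattr(d, "size", "not supported"))
```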
Posted 12-19-2023 07:48 AM

IDEA
Experience Builder excels at configuring an interactive web interface with buttons, pages, etc. Dashboards and Insights are great at data visualization, with more charting options. Some features overlap, but, for example, the bar charts in Experience Builder are much less sophisticated than those in ArcGIS Dashboards. So currently you have to choose between one or the other.

It would be great if I could combine the two by embedding a Dashboard or Insights app inside Experience Builder with interactivity, so that actions in the Experience GUI, like filtering from a menu dropdown or selecting a feature, could be passed to the Dashboard or Insights app as parameters to filter its various charts.

An example might be an Experience app for conveying information about a nationwide program, complemented by dashboards or Insights apps that have already been built. The Experience provides users a dropdown of states so they can filter the data in the Experience to a single state. The enhancement would allow the Experience to extend that state filter to the various widgets in the Dashboard or Insights app.
Posted 12-14-2023 07:15 AM

IDEA
Our ArcGIS Online org contains 1,000 groups, 2,500 users, and 30,000 items. I am trying to build BI reports on the usage of, and access to, data within our org. The scheduled reports for items and users are great for understanding what's available and who owns it. What's missing is who has access to what, and ArcGIS Online groups are the relational key between users and the items they might use. It would be helpful to have an additional report that provides this information in bulk.

How would I do this today? I've looked at the REST documentation, and unfortunately I'd have to make thousands of calls to get the membership and contents of each group. First I'd query the endpoint https://org.arcgis.com/sharing/rest/community/groups to retrieve an array of all the groups and their basic properties. Those results don't contain info about the users that are members or the items shared to the group, so I need to call two separate endpoints for each group. For the member users, I need to loop over each group and query https://org.arcgis.com/sharing/rest/community/groups/<groupid>/userList , then re-query with pagination if the group's user count is above 100. For the items, I need to loop over each group and query https://org.arcgis.com/sharing/rest/content/groups/<groupid> , then also re-query with pagination if the item count is above 100. That's 2,000+ HTTP calls to the REST API, so a lot of back-and-forth latency and load on the API, as the sketch below illustrates.

What would the solution look like? I think it would be better to have a process within ArcGIS Online that can quickly extract this data in bulk and export it to some kind of report. I'd like the report to contain some info about each group (ID, title, isInvitationOnly, owner, tags, created, modified, access, protected, autoJoin, isOpenData, whether it is a shared update group, etc.). It would also have a list of the usernames that are members of each group, preferably with each user's member role in the group. Finally, it would have a list of the item IDs for items shared to the group. Combined with the existing item and user reports in another tool like Excel or Power BI, this would be enough to analyze the relationships.

Although the existing reports are all provided in CSV format, JSON seems better equipped to handle the arrays of usernames and item IDs for this kind of report. I'm not very familiar with CSV standards, so maybe there is a way to nest a comma-separated list of usernames or item IDs inside double quotes. Alternatively, maybe it needs to be three different CSV reports: the first with one row per group and the group's properties; the second a report of usernames, the group IDs they are members of, and each user's member role in the group; the third a report of item IDs and the groups they are shared to.
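To show the scale of the workaround, here's a rough sketch of the per-group crawl described above. The org URL and token are placeholders, and the paging and response-key details are my best reading of the REST docs rather than tested code:

```python
import requests

ORG = "https://org.arcgis.com/sharing/rest"
TOKEN = "<admin-token>"

def get_json(url, **params):
    params.update({"f": "json", "token": TOKEN})
    return requests.get(url, params=params, timeout=30).json()

# 1. Page through all groups (100 per call => 10+ calls for 1,000 groups).
groups, start = [], 1
while start > 0:
    page = get_json(f"{ORG}/community/groups", q="*", num=100, start=start)
    groups += page.get("results", [])
    start = page.get("nextStart", -1)   # -1 signals the last page

# 2. Two more calls per group -- more if membership or content exceeds 100.
report = []
for g in groups:
    members = get_json(f"{ORG}/community/groups/{g['id']}/userList", num=100)
    items = get_json(f"{ORG}/content/groups/{g['id']}", num=100)
    report.append({
        "group": g["id"],
        "users": members.get("users", []),
        "items": [i["id"] for i in items.get("items", [])],
    })

print(len(groups), "groups crawled in roughly", 10 + 2 * len(groups), "calls")
```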
Posted 12-14-2023 06:52 AM

IDEA
The new Generate Schema Report tool in Pro is a great way to export the details of all objects in a geodatabase. However, depending on which output format you choose, the data types reported for each geodatabase item are names intended for code rather than the common names displayed to end users in the software.

For example, I have a feature dataset in my geodatabase, and the Pro catalog interface shows its type with a space, as "Feature Dataset". When I run the tool for an HTML output, it labels it as dataset type 'FeatureDataset'; notice there is no space. In a similar fashion, the JSON output of the tool gives the type as 'DEFeatureDataset'. Similar things happen for the geometry types of feature classes (point, line, polygon, etc.) and for field types (OID, Geometry, String, Integer, Date, etc.). These are not the types users see in the Fields view of a feature class in Pro.

Although I'm sure these names are great for coding and play their role in the SDK and backend, I'd prefer an additional property in the output with a proper human-readable label that matches what my GIS analysts see when looking at data in Pro through the catalog interface. Having those end-user-friendly labels would help when using these reports in engagements with users about schema. I could convert all these values to the human-readable Pro UX labels myself if I knew where the definitive list exists, but it would be great if it were in the raw output.
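Absent that definitive list, the conversion I have in mind would look something like this: post-process the JSON report with a hand-built mapping. Both the mapping values and the report structure the walker assumes are guesses on my part, not documented:

```python
import json

# Partial, hand-built mapping from internal type names to the labels analysts
# see in Pro. These pairings are assumptions -- the definitive list may differ.
FRIENDLY = {
    "DEFeatureDataset": "Feature Dataset",
    "DEFeatureClass": "Feature Class",
    "DETable": "Table",
    "esriFieldTypeOID": "Object ID",
    "esriFieldTypeString": "Text",
    "esriFieldTypeDate": "Date",
}

with open("schema_report.json", encoding="utf-8") as f:
    report = json.load(f)

def add_labels(node):
    """Recursively tack a humanLabel next to every type name the map knows."""
    if isinstance(node, dict):
        t = node.get("datasetType") or node.get("type")
        if t in FRIENDLY:
            node["humanLabel"] = FRIENDLY[t]
        for v in node.values():
            add_labels(v)
    elif isinstance(node, list):
        for v in node:
            add_labels(v)

add_labels(report)
```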
Posted 11-08-2023 06:34 AM

POST
I'm looking for a way to easily detect all feature classes in an enterprise geodatabase that are configured with branch versioning. Has anyone found a straightforward way to do this in an automated fashion? Arcpy's Describe provides a way to determine whether a feature class is versioned, but no way to distinguish traditional from branch versioning.

That led me to look at the SDE business tables, where I found two possible candidates. I'm curious whether anyone has had more experience with these, since I can't find documentation on what each does. BRANCH_TABLES_MODIFIED is the first candidate. It has a BRANCH_ID column, with many rows having a zero in it, and a REGISTRATION_ID, which seems to be the same ID as on SDE.TABLE_REGISTRY. I assume BRANCH_ID = 0 equates to the default version for each table. So if I filter to BRANCH_ID = 0 and join to TABLE_REGISTRY, does that get me the comprehensive list? MULTIBRANCH_TABLES is another candidate, with a REGISTRATION_ID and a START_MOMENT date column; I seem to get fewer tables back from it.
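This is the kind of query I'm experimenting with, via arcpy's SQL pass-through. To be clear, the join is only my speculation about the schema (the meaning of these tables and columns is unverified), not a documented approach:

```python
import arcpy

# SQL pass-through against the geodatabase; adjust syntax per DBMS.
sde = arcpy.ArcSDESQLExecute(r"C:\connections\gis.sde")

sql = """
SELECT DISTINCT tr.owner, tr.table_name
FROM sde.table_registry tr
JOIN sde.branch_tables_modified btm
  ON btm.registration_id = tr.registration_id
WHERE btm.branch_id = 0
ORDER BY tr.owner, tr.table_name
"""

rows = sde.execute(sql)
# execute() returns a list of row lists for multi-row results,
# or a scalar (e.g. True) for empty/non-select results.
if isinstance(rows, list):
    for owner, table in rows:
        print(f"{owner}.{table}")
```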
Posted 11-01-2023 12:52 PM

IDEA
Power BI has many out-of-the-box connectors to bring data from external sources (e.g., Salesforce) into its data model. It would be helpful if ArcGIS Online (and Enterprise) were one of those options. GIS layers are often the database of record for data assets in an organization, so they are useful for their tabular data in addition to the spatial aspect. That data then needs to be joined to information from other systems, like maintenance and permitting, to report information useful to the business, and Power BI modelers often need to add measures based on the data to create reports.

The ArcGIS for Power BI visual is a great start, since it allows data from a model to be joined to GIS data for map visualization. But it prevents incorporating the data further upstream, where it could be processed in Power Query or incorporated into measure functions. As for the spatial columns, Power BI seems to honor GeoJSON as a data format, so it would be helpful if this proposed connector could bring the spatial geometry from the Esri service into a data model column.

At the 2023 UC technical workshop "ArcGIS for Microsoft 365: An Overview", I saw that Esri is trying to accommodate this need by providing a Power Automate template workflow that uses the ArcGIS Power Automate premium connector to read a GIS service and write it to a CSV in OneDrive, so it can then be imported into a Power BI model. Although this may suit some users' workflows, it feels like a workaround that adds extra data hops. Building a Power BI connector would take development along the same lines further, making it quick and easy to get data from ArcGIS into Power BI models from a single interface.

Matthew Roche at Microsoft has a maxim about where data processing should occur in the hops from source data to Power BI model: "Data should be transformed as far upstream as possible, and as far downstream as necessary." If Power BI had a "Get Data from ArcGIS" connector, it would help modelers in Esri shops follow this principle when working with data. Here's a 20-minute presentation where he talks about it: Roche's Maxim of Data Transformation - SQLBits Presentation
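As a stopgap along the same lines as the Power Automate template, here's a minimal sketch of pulling a layer as GeoJSON with Python so a model can ingest it. The layer URL is a placeholder, the service has to support f=geojson, and secured services would also need a token:

```python
import requests

# Hypothetical public feature layer.
LAYER = "https://services.arcgis.com/<orgid>/arcgis/rest/services/Assets/FeatureServer/0"

resp = requests.get(
    f"{LAYER}/query",
    params={"where": "1=1", "outFields": "*", "f": "geojson"},
    timeout=60,
)

# Write GeoJSON somewhere Power BI / Power Query can pick it up.
with open("assets.geojson", "wb") as f:
    f.write(resp.content)
```

A real job would also have to page past the service's maxRecordCount using resultOffset/resultRecordCount, which still illustrates the extra hops a native connector would eliminate.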
Posted 07-14-2023 07:32 AM