IDEA
Please enhance the "Publish Web Layer" and "Overwrite Web Layer" tools in Pro so that they check that all layers & standalone tables in the service that have Editor Tracking Enabled use the same "Time standard" option. I have found that if one layer has "database time" and a second has UTC time, then the tool will allow the service to publish, but it will be corrupt. According to the documentation all layers in a service with Editor Tracking must use the dame Time standard option. I encounter this in ArcGIS Enterprise 11.5, Pro 3.5 when publishing a feature service referencing an enterprise geodatabase. In my situation the difference was an oversite when setting up editor tracking when adding some layers to a service. It would be helpful if the tool checked for this, informed the user, and prevented publishing.
a week ago

IDEA
Please enhance the Geotab LogRecord Connector so that it pulls the Bearing attribute from Geotab. The documentation (Link) mentions bearing as a value that is pulled, but the table further down in the documentation is missing the bearing value. After testing the connector, I can verify that bearing does not appear in data from the feed. Bearing does come through on the DeviceStatusInfo feed, though, and it would be great if there were parity between the two. My department's staff rely on the bearing value for vehicle orientation on maps.
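For anyone wanting to confirm what Geotab itself returns for each entity type, here is a quick sketch using the mygeotab Python SDK. The credentials are placeholders, and whether LogRecord carries a bearing property at all is exactly what the check verifies:

```python
import mygeotab

# Placeholder credentials for the MyGeotab API
api = mygeotab.API(username="user@example.com", password="...", database="my_db")
api.authenticate()

# DeviceStatusInfo entities report a bearing property
for s in api.get("DeviceStatusInfo", resultsLimit=5):
    print(s.get("device", {}).get("id"), "bearing:", s.get("bearing"))

# LogRecord entities: check whether bearing is present at all
for rec in api.get("LogRecord", resultsLimit=5):
    print(rec.get("device", {}).get("id"), "bearing:", rec.get("bearing"))
```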
2 weeks ago

POST
Is anyone having issues with the Geotab Feed for Velocity for ArcGIS Online? We started having issues with ours on Friday and Saturday. If I start the feed, it pulls in the initial set of data fine, but then each subsequent request for updates returns the following error: "Unable to parse DeviceStatusInfo data." I have a ticket in with Esri, but no word back yet. I was curious if others are running into this as well. My Geotab admin says nothing changed with our configuration last week, so I'm not sure what's different. It seems like something changed in how Esri's connector is parsing the Geotab DeviceStatusInfo data.
2 weeks ago

BLOG
Glad to see some progress in this space for symbology. Is this setting saved to the original item, or is it on some kind of Hub "twin" of the original item? I'm thinking of situations where a layer is shared to more than one hub's content library. And how can I scan my items' JSON to find items that have this setting applied or not applied? This gets at @BrandonGuo's comment. I'd prefer to be able to set the original style option at the site level, or at least be able to set it as the default at the site level. I created a new Idea for this here: Ability to set ArcGIS Hub layer symbology and styl... - Esri Community. So please go upvote it if you would find this useful.
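On the scanning question, a starting point with the ArcGIS API for Python is sketched below. Where the toggle actually lives in the item JSON is unknown to me, so the hubDisplaySettings key is a purely hypothetical placeholder to show the scanning pattern:

```python
from arcgis.gis import GIS

gis = GIS("home")  # or GIS(url, username, password)

# Scan feature layer items and dump any Hub-related display settings.
# Assumption: the symbology toggle lives somewhere in the item's JSON;
# "hubDisplaySettings" below is a purely hypothetical key name.
for item in gis.content.search(query="", item_type="Feature Service", max_items=100):
    try:
        data = item.get_data(try_json=True) or {}
    except Exception:
        continue  # some items carry no JSON data resource
    setting = data.get("hubDisplaySettings")
    if setting is not None:
        print(item.id, item.title, setting)
```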
11-19-2025 07:05 AM

IDEA
As of Oct 2025, ArcGIS Hub allows site admins to choose how a layer is symbolized in Hub at the layer level. This means if an organization prefers to use its own symbology across multiple sites, potentially with hundreds of layers, staff will need to visit each layer in each Hub content library and change this setting manually. Please add one of the following options at the Hub site level to make it easier to set the original symbology as the default.

Option 1) Have one setting at the site level to choose between Hub symbology (the blue) or original symbology. In this scenario there could be confusion with the layer-level setting, so maybe that should be disabled if the site admin elects to use native symbology.

Option 2) Add a setting at the site level that defines the default as one of the two choices (Hub or Original). Any layers added to the site after the setting is applied get that default applied. This means owners of existing sites would still have to go through all their sites and manually update their layers.

Option 3) Add a button at the site level that allows the site admin to run a one-time process to set all layers in the Hub content to one of the options.

I think Options 2 and 3 are probably the easiest to implement given there is already a layer-level setting, and both could actually be implemented together and complement each other well.
11-19-2025 07:03 AM

POST
Thank you Mike. That QueryDataElement REST endpoint does indeed return details about attribute rules for the underlying data. I can see the attribute rules being returned from some of my testing services where the attribute rules do appear in Pro when inspecting the service through the Data Design menu. However, I also ran it against some of the services where Pro is not showing the attribute rules. For those, the QueryDataElement endpoint returns this error:

```json
{
  "error": {
    "code": 400,
    "message": "Unable to complete operation.",
    "details": ["Parser error: Some parameters could not be recognized."]
  }
}
```

So I assume Pro is getting the same message but not displaying it to the user. I'm still going to run a few more tests, but it seems that including a query layer among the otherwise typical feature classes and geodatabase-registered standalone tables in the service is what causes this error. Referencing a database view (either registered with the geodatabase or not) is probably going to be the workaround. I'll try to respond with the final results when we close the ticket.
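For reference, hitting the operation directly looks roughly like the sketch below. The URL and layer IDs are placeholders, and the response keys (layerDataElements, attributeRules) follow my reading of the data-element JSON and may differ by release:

```python
import json
import requests

# Placeholder service URL; token handling omitted for brevity
SERVICE = "https://example.com/server/rest/services/MyService/FeatureServer"

resp = requests.get(
    f"{SERVICE}/queryDataElements",
    params={"layers": json.dumps([0, 1]), "f": "json"},
)
payload = resp.json()

if "error" in payload:
    # This is where the "Parser error" above would surface
    print("queryDataElements failed:", payload["error"])
else:
    # Attribute rules, if present, ride along in each layer's data element
    for element in payload.get("layerDataElements", []):
        rules = element.get("dataElement", {}).get("attributeRules", [])
        print(element.get("layerId"), "rules:", [r.get("name") for r in rules])
```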
07-30-2025 07:08 AM

POST
Thanks Mike. I'm in the middle of troubleshooting with support, as the issue seems intermittent. I'm going to be rebuilding things today and verifying as I go. I think the rules are eventually firing after a row is saved or committed in the standalone table in my Pro project.

I have also noticed another related bug with the service. I have a geodatabase-referenced feature service that I've been publishing and republishing as I build out my schema for editing. So I've been writing attribute rule code, then publishing, testing, and republishing as needed. Once you publish the service and load it into a Pro map's contents, you can right-click a layer in the service, open the "Data Design" menu, and choose "Attribute Rules". Sometimes this will show the attribute rules on the feature class/table referenced by that layer, but sometimes it is blank, even though I know there is an attribute rule on the table. This is what led me to think something is being dropped when the service is published.

Also, the reason I was thinking something gets published to the service is the "Exclude from Application Evaluation" feature. Hussein explains it pretty well in the blog article I just found this morning, linked below. In order for a client application to fire attribute rules client-side, it needs to know what they are, so I assumed they are defined somewhere in the service definition REST. Either that, or the service makes a special call to the geodatabase to retrieve the rules and send them to the user's Pro client at some point before edits are made. Attribute Rules - Exclude from Application Evaluat... - Esri Community
07-28-2025 07:04 AM

POST
I am trying to troubleshoot an issue with editing data in an enterprise geodatabase through a feature service in ArcGIS Enterprise 11.3. Does anyone know if information about attribute rules on feature classes is put into a feature service when you publish a referenced service to Enterprise? And if so, where can I inspect that attribute rule definition in the service's JSON? I'm asking because I've run into an issue where a calculation attribute rule fires if I edit the feature class directly from Pro, but if I make the same edit through the service referencing the feature class, the attribute rule doesn't fire. This made me think that somehow the service is not aware of the attribute rule, so I wanted to verify.
07-22-2025 10:54 AM

IDEA
In ArcMap, you could edit data in tables or feature classes that were not registered with an enterprise geodatabase. That made it convenient to make small edits to ancillary tables in the database when needed, rather than having to open a separate tool like SSMS, Oracle SQL Developer, or pgAdmin. This capability was never added to Pro. Instead, when you try to edit a non-registered standard table, all the editing features are greyed out and you get a warning: "This enterprise database table is not registered with the geodatabase. Edits cannot be made." I would like to see this ability added to Pro so it works like it did in ArcMap. I know this has its limitations (like having to start an edit session, click save to commit, and the lack of undo), but sometimes you just want to use Pro to do this kind of work instead of opening separate applications. I'll also say that just registering these other tables with the geodatabase isn't always an option; for example, in SQL Server you can't register a table that uses an identity column as its primary key, and identity columns are much less of a pain than managing sequences. I'll also clarify that I think this would apply to tables or feature classes in a relational database that has been configured as an Esri enterprise geodatabase but just aren't registered for specific business needs. It could also apply to tables or feature classes in a relational database that is not configured as an enterprise geodatabase, such as a SQL Server or Postgres database with some tables with geometry columns.
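In the meantime, one scripted workaround from inside Pro is to push the SQL through arcpy.ArcSDESQLExecute against the same connection file. A minimal sketch; the connection path, table, and column names are placeholders:

```python
import arcpy

# Placeholder .sde connection file; table and column names are made up
conn = arcpy.ArcSDESQLExecute(r"C:\connections\gisdb.sde")

# Edits to non-registered tables go straight through SQL: no edit
# session and no undo, so keep statements small and deliberate.
conn.startTransaction()
try:
    conn.execute(
        "UPDATE dbo.lookup_codes SET description = 'Updated' WHERE code_id = 42"
    )
    conn.commitTransaction()
except Exception:
    conn.rollbackTransaction()
    raise
```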
06-16-2025 02:17 PM

POST
I just published my first hosted image layer to ArcGIS Online. It is a "tiled imagery layer" comprising just one raster. I wanted to check how much storage it is taking up in AGOL (and thus how many credits it uses for storage daily), but the size section of the item page says 0 KB. I even tried running the item report as an AGOL admin, and the item's file storage and feature storage values are zero. The source raster I uploaded is a 20 MB TIF with LZW compression, so I'd imagine its AGOL counterpart would be at least that size or larger due to the tiling. Has anyone had experience finding the size information for these imagery item types? Thank you for any information you can provide, -Andrew
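One more place to check, sketched with the ArcGIS API for Python. The item ID is a placeholder, and for tiled imagery the size property may still report 0 if tile storage is accounted for elsewhere:

```python
from arcgis.gis import GIS

gis = GIS("home")

# Placeholder item ID for the tiled imagery layer
item = gis.content.get("0123456789abcdef0123456789abcdef")

# item.size reports bytes for many item types; tiled imagery may
# still show 0 here if tile storage is tracked separately.
print(item.title, item.type, "size (bytes):", item.size)
```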
03-19-2025 11:47 AM

IDEA
Please add the ability for ArcGIS Online layers to be displayed as Hub content using their default symbology. Currently, every hosted feature layer displayed in a Hub site's content is given a default blue symbology. Many GIS staff create default symbology for ArcGIS Online layers, especially for data that relies on some categorization to make sense. Of course someone can change that default after they add the layer to a map or download it, but often there is a common symbology for a layer that staff in an organization like to use as a "one size fits most" when adding data to applications. It would be helpful to have these datasets displayed to the public in the same way, rather than the symbology reverting to generic colors. For some datasets this simple color scheme makes it hard to communicate the value of the data to the public when they first glance at it from a Hub content explore page.
10-11-2024 12:58 PM

IDEA
Please add an option in Real-Time Analytics that allows an input data source from a hosted feature layer to auto-refresh on a schedule. In my example, I have a feed of data from a data source, and I need to join on information from a hosted feature layer. The hosted feature layer gets updated once every night from an ETL operation. This community article (Link) confirms that a real-time analytic only reads the source input when the analytic is started, meaning the data could be days, weeks, or months old if the analytic is stable and never needs to be restarted. I'd prefer not to have to log in and manually restart the analytic daily for the join table info to update. It would be great if there were an extra option on the Source -> Feature Layer configuration window that allowed the user to define a refresh interval in hours or days. Going one step further, I think some feature layers have a property with the timestamp of the last data refresh, so it might be possible for the analytic to ping that timestamp on an interval and only ingest the refreshed data when it determines the source has changed. Thank you
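On that last point, hosted feature layers do report a last-edit timestamp in their REST JSON. A sketch of polling it, assuming editingInfo.lastEditDate is present (epoch milliseconds); the layer URL is a placeholder:

```python
from datetime import datetime, timezone

import requests

# Placeholder hosted feature layer URL (layer 0)
LAYER = "https://services.arcgis.com/ORG/arcgis/rest/services/Joins/FeatureServer/0"

info = requests.get(LAYER, params={"f": "json"}).json()

# Assumption: editingInfo.lastEditDate is epoch milliseconds
last_edit_ms = info.get("editingInfo", {}).get("lastEditDate")
if last_edit_ms:
    last_edit = datetime.fromtimestamp(last_edit_ms / 1000, tz=timezone.utc)
    print("Layer last edited:", last_edit.isoformat())
    # A scheduled script could compare this against the previous run's
    # value and restart the analytic only when the source has changed.
```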
09-20-2024 09:19 AM

POST
Hello, I'm relatively new to attribute rules, and I'm currently exploring options for using batch calculation as an alternative to immediate calculation to speed up some bulk processing after large nightly data imports. I'm trying to create a batch calculation attribute rule, and I only need it to run on a subset of features in a feature class. I'm curious if there is a way to apply a filter on the attribute rule so it only runs on features that meet the conditions of a SQL WHERE clause. My hope is this will speed up the batch calculation run time. For example, I'd like a batch calc attribute rule that does something like this: for Batch Calc Attribute Rule A, only run it against rows in Feature Class B that match SQL WHERE clause C. For records that match the WHERE clause, update one attribute value on the row; all other rows in the feature class are ignored. I guess the simple way to do this is to put an IIF() statement inside the attribute rule, but that means the rule will evaluate every row in the feature class. I already know this one attribute rule would only apply to features with specific values, which would drop the number of rows to assess from 100k+ to only 100 or so. I guess there is a way to use a Filter() function, but I haven't seen an example of sticking that in an attribute rule.
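Absent a rule-level filter, the IIF() guard can at least be baked in when the rule is created. A minimal sketch with arcpy.management.AddAttributeRule; the feature class, field names, and the condition itself are placeholders, and returning the field's current value for non-matching rows effectively leaves them unchanged:

```python
import arcpy

# Placeholder feature class and field names
fc = r"C:\connections\gisdb.sde\gis.LINES"

# Arcade guard: only recalculate rows matching the condition; all
# other rows get their current value written back, i.e. no change.
arcade = """
IIF($feature.STATUS == 'NEW',
    Upper($feature.SOURCE_CODE),
    $feature.CALC_FIELD)
"""

arcpy.management.AddAttributeRule(
    in_table=fc,
    name="CalcFieldForNewRows",
    type="CALCULATION",
    script_expression=arcade,
    field="CALC_FIELD",
    batch="BATCH",  # batch calculation rule, evaluated on demand
)
```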
01-25-2024 08:55 AM

POST
Hello, I'm relatively new to attribute rules. I'm trying to build them into a new schema I'm developing to handle primary key population with a sequence using the NextSequenceValue() function. I've noticed that just by adding this one simple attribute rule, bulk imports to the feature class slow significantly. For my test I used a sample line dataset of 1,000 rows. When importing data through the Append tool with the rule disabled, the data loads in seconds. With the sequence attribute rule enabled, it takes about 1.5 minutes. And this is being done from Pro on a machine in the same datacenter as the Oracle geodatabase I'm loading into.

1.5 minutes is okay, but that's just for 1,000 rows. With the schema I'm building, one major workflow is loading data from a third party, and some of the 1:M tables related to the line feature class could get thousands of rows each every day. I'm worried the load time will be so long with this attribute rule that I'll hog our ETL scripting server all night with just this one job. The same thing happens when loading the data via FME.

I'm wondering if I should fall back to alternative ways of generating primary keys outside of the Esri ecosystem, for example creating a trigger at the database level to grab sequence values on row inserts, or running a query in the ETL tool to grab sequence values from the database upstream of the geodatabase import. These feature classes won't be versioned or archive-enabled, so I don't think trigger population would be an issue.

Although attribute rules are great for managing key attributes during transactional edits done by analysts throughout the day, they seem to really slow down bulk loads. I'm curious if folks out there have created mitigation strategies for speeding up nightly bulk loads while keeping attribute rules intact for daily edits.
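One mitigation pattern, sketched with arcpy: disable the calculation rule for the bulk window, append, backfill the keys straight from the database sequence, then re-enable the rule. Paths, names, and the Oracle sequence are placeholders, and this assumes the load window can tolerate the rule being off:

```python
import arcpy

fc = r"C:\connections\gisdb.sde\gis.LINES"   # placeholder path
rule = "PopulatePrimaryKey"                  # placeholder rule name

# Turn the calculation rule off for the bulk window
arcpy.management.DisableAttributeRules(fc, [rule], "CALCULATION")
try:
    arcpy.management.Append(r"C:\staging\lines.gdb\LINES", fc, "NO_TEST")

    # Backfill keys straight from the database sequence (Oracle syntax)
    conn = arcpy.ArcSDESQLExecute(r"C:\connections\gisdb.sde")
    conn.execute(
        "UPDATE gis.LINES SET line_id = gis.line_id_seq.NEXTVAL "
        "WHERE line_id IS NULL"
    )
finally:
    arcpy.management.EnableAttributeRules(fc, [rule], "CALCULATION")
```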
01-23-2024 07:31 AM

IDEA
The enhancements implemented in arcpy.Describe for Pro 3.2 don't return a dateModified property for feature classes in my 10.9.1 Oracle geodatabase, nor do they work for feature classes in my 10.9.1 Azure SQL Database. It does work for a file geodatabase feature class, though. So really the enhancement is that arcpy Describe will return size and dateModified if the underlying geodatabase supports it, and enterprise geodatabases (at least at 10.9.1) don't support it. I assume this is because the SDE business tables don't contain any kind of auditing fields for date created or date modified for geodatabase items; I've hunted for something like this with no luck. I'd be curious whether the value would be returned by arcpy if the Oracle geodatabase were at version 11.2, but I assume those columns would have to be added to the SDE business schema somewhere and populated to make the data available for arcpy to grab. I could not find any documentation specific to what's new in enterprise geodatabase functionality at 11.2, so I can't verify this was done.
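For anyone reproducing this, the quick check looks like the sketch below, assuming arcpy.da.Describe surfaces the new keys where supported; the paths are placeholders:

```python
import arcpy

# Placeholder paths: one enterprise feature class, one file gdb feature class
for fc in (r"C:\connections\gisdb.sde\gis.PARCELS", r"C:\data\local.gdb\PARCELS"):
    d = arcpy.da.Describe(fc)
    # dateModified/size only appear when the underlying workspace supports them
    print(fc, "->", d.get("dateModified"), d.get("size"))
```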
12-19-2023 07:48 AM