POST
Hello @NarayanHamde,

To better understand what's happening, could you clarify a few things:

- What version of ArcGIS Enterprise are you using?
- How large are the multipatch layers or SLPKs you're working with (in GB)?
- When you say some layers fail with a wait timeout, does this show up in the ArcGIS Server logs at the SEVERE or WARNING level?

It would also be great if you could share the following:

- Server logs – Temporarily set the log level to DEBUG and review what happens at the moment publishing fails. This can confirm whether it's a timeout, memory, or cache issue. (Be sure to switch back to SEVERE or WARNING afterward, since keeping logs at DEBUG can quickly consume storage.) A small example of pulling these records follows this post.
- Machine resources – Large 3D publishing jobs can require a lot of CPU and RAM. Do you know how many cores and how much memory your ArcGIS Server site has available?

Sharing these details will really help in narrowing down the root cause of the issue.
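If it helps, here is a minimal sketch (assuming you have an admin token and your site's admin URL; both values below are placeholders) that pulls recent log records through the ArcGIS Server Admin REST API's logs/query operation, so you can capture what the server reports at the moment publishing fails:

import requests

SERVER_ADMIN = "https://gisserver.domain.com:6443/arcgis/admin"  # placeholder URL
TOKEN = "<admin token>"  # generate via the Admin API's generateToken

def query_logs(level="WARNING", page_size=100):
    """Return recent log records at the given level or more severe."""
    resp = requests.post(
        f"{SERVER_ADMIN}/logs/query",
        data={
            "level": level,       # SEVERE | WARNING | INFO | FINE | DEBUG
            "pageSize": page_size,
            "token": TOKEN,
            "f": "json",
        },
        verify=False,  # only if the site uses a self-signed certificate
    )
    resp.raise_for_status()
    return resp.json().get("logMessages", [])

for record in query_logs("WARNING"):
    print(record["time"], record["type"], record["message"][:120])

Running this right after a failed publish (with the level temporarily at DEBUG) should surface the timeout or memory messages without digging through the log files by hand.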
POST
Hello @DanNarsavage_IDWR,

I was thinking more along the lines of the privileges of the user to whom you are trying to assign this item – for example, whether that user has a "Publisher", "Administrator", or custom role. But while looking into it, I found the following, which has been logged as a bug:

BUG-000178457: Managing data store items returns the error message "Can't change owner on item in the Enterprise portal".
https://support.esri.com/en-us/bug/managing-datastore-items-returns-the-error-message-cant-bug-000178457

Hope it helps!
3 weeks ago

POST
Hello @vaishalikulkarni00,

I wanted to provide a quick update. I have not been able to identify any viable workarounds for this issue so far. However, I'm continuing to explore possible solutions and will keep you informed of any progress.
04-23-2025 10:02 AM

POST
@vaishalikulkarni00, it does seem to be different behavior than expected. However, I think the configuration being on the Admin page is correct, and the 11.1 documentation says the same; please see the details below:

- Create a geoprocessing service webhook: https://enterprise.arcgis.com/en/portal/11.1/administer/windows/create-webhooks.htm#ESRI_SECTION1_A94ED6E140A4426A8EC72C9A03786C5F
- Create a feature service webhook: https://enterprise.arcgis.com/en/portal/11.1/administer/windows/create-webhooks.htm#ESRI_SECTION1_A3A3B5AE99F446C7B48ABE051E79F0C6

Whereas the documentation for 11.4 suggests a different workflow:

- Configure service webhooks: https://enterprise.arcgis.com/en/portal/latest/administer/windows/create-webhooks.htm#ESRI_SECTION1_A47832EBBC414E7C8250FFB95CCF000E

My guess is that in newer releases the behavior has been made more user-friendly, as you would expect. Additionally, I don't think we should be looking for a "Create Webhook" option on the REST Admin API. I will look for a way that allows you to create a webhook with the custom role configuration and keep you posted if I find anything. Furthermore, reaching out to Esri Support at this point would also be a good idea.

Hope it helps!
04-15-2025 12:09 PM

POST
Hello @vaishalikulkarni00,

Since the admin is able to create a webhook, I would also assume that the SD (service definition) is OK. However, let's cover all our bases and try the following:

- Try updating some other capability first, as "Sync" requires additional checks on the data configuration end. For example, try disabling the "Editing" capability, or pick any other example from this document: https://developers.arcgis.com/rest/services-reference/enterprise/update-definition-feature-service/ (see the sketch after this post).
- Try publishing a hosted feature service with a very small dataset and run the webhook creation workflow against that.
- Check the sharing settings of the service. (This shouldn't matter since you are the owner, but a check wouldn't do any harm.)
- Check how the custom role was created. I use the default Publisher role as a base and then just add the "Webhooks" capability (as shown in the video); maybe some required capability is missing from the role.

In the meantime, I will try to research this further and share details.

Hope it helps!
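For the first suggestion, here is a minimal sketch of the Update Definition call from the document linked above, assuming a hosted feature service and an already-generated token. The service URL and token are placeholders, and the payload shown only rewrites the capabilities list:

import json
import requests

ADMIN_URL = ("https://host.domain.com/server/rest/admin/services/"
             "MyService/FeatureServer/updateDefinition")  # placeholder service
TOKEN = "<token>"

# e.g. reduce the service to query-only by removing editing capabilities
payload = {"capabilities": "Query"}

resp = requests.post(
    ADMIN_URL,
    data={
        "updateDefinition": json.dumps(payload),
        "async": "false",
        "token": TOKEN,
        "f": "json",
    },
)
print(resp.json())  # expect {"success": true} on a successful update

If this call succeeds for a simple capability change but the webhook workflow still fails, that points the investigation toward the role configuration rather than the service definition.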
04-15-2025 10:07 AM

POST
Hello @vaishalikulkarni00,

I ran some tests on my side by configuring a similar role on both versions 11.1 and 11.3 (with and without the Azure AD configuration) and followed the workflow in the document below:

Create (Webhooks) | ArcGIS REST APIs | ArcGIS Developers

The key point is to update the URL to the REST Admin API. For example:

- REST API URL: https://sampleserver6.arcgisonline.com/arcgis/rest/services/CommercialDamageAssessment/FeatureServer
- ArcGIS REST Admin URL (note the added "admin" segment): https://sampleserver6.arcgisonline.com/arcgis/rest/admin/services/CommercialDamageAssessment/FeatureServer

The admin URL should give you access to the "Update Definition" operation; from there, follow the documentation mentioned above (a small helper illustrating the URL change follows this post). Furthermore, I also noticed the following:

- If you are the owner of the feature layer (hosted or referenced): no redirect occurs.
- If the item is owned by another user (hosted or referenced): a redirect occurs and tries to generate a token.

Hence, I think the redirect has nothing to do with the Azure AD configuration; my assumption is that it validates your access to the item. Could you try taking ownership of the item, or run the workflow on an item owned by your username?

Please let me know how it goes; in the meantime, I will try to find another workaround.

Hope it helps!
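Since this URL change trips people up, here is a trivial helper showing exactly the transformation described above – inserting "admin" after /rest/ in a public REST services URL:

def to_admin_url(rest_url: str) -> str:
    """Swap the public REST endpoint for its Admin API counterpart."""
    return rest_url.replace("/rest/services/", "/rest/admin/services/", 1)

print(to_admin_url(
    "https://sampleserver6.arcgisonline.com/arcgis/rest/services/"
    "CommercialDamageAssessment/FeatureServer"
))
# -> https://sampleserver6.arcgisonline.com/arcgis/rest/admin/services/CommercialDamageAssessment/FeatureServer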
04-15-2025 02:09 AM

POST
Hello @vaishalikulkarni00,

I assume that your assigned role is a built-in role, not a custom one; by default, the Webhooks capability is only available to the Administrator role. Please refer to the documentation below for more details:

- Full document: Privileges granted to roles—Portal for ArcGIS | Documentation for ArcGIS Enterprise
- Webhooks section (Geoprocessing): https://enterprise.arcgis.com/en/portal/11.1/administer/windows/privileges-for-roles-orgs.htm#ESRI_SECTION1_C30D73392D964D51A8B606128A8A6E8F:~:text=is%20turned%20off.-,Webhooks,-Geoprocessing
- Webhooks section (Organization webhooks): https://enterprise.arcgis.com/en/portal/11.1/administer/windows/privileges-for-roles-orgs.htm#ESRI_SECTION1_C30D73392D964D51A8B606128A8A6E8F:~:text=Organization%20webhooks

Depending on the type of webhook you want to create, I recommend configuring a custom role, as assigning the default Administrator role may not be feasible for the organization admin:

- For geoprocessing service webhooks: assign the privileges listed in the geoprocessing section above to the custom role.
- For organization webhooks: assign the privileges listed in the organization webhooks section above to the custom role.

Hope it helps!
04-10-2025 04:42 AM

POST
Hello @CarstenHogertz,

As per my understanding, you should use "<ipVM1>:6443/arcgis", the same URL used when registering the first ArcGIS Data Store machine, as recommended in the documentation. For more details: https://enterprise.arcgis.com/en/portal/latest/administer/windows/add-standby-machine.htm#ESRI_SECTION1_91C0037FB8454EAFBB298CD4A0A07356:~:text=Use%20the%20same%20GIS%20Server%20site%20as%20you%20did%20when%20configuring%20the%20other%20machine%20or%20machines%20in%20the%20same%20data%20store%20for%20this%20ArcGIS%20Enterprise%20deployment.

Hope it helps!
03-20-2025 01:23 AM

POST
Hello @JohnDeLucio1,

It seems that the Network Analysis service cannot find some of the files within the ArcToolbox that allow ArcGIS Server to execute these tools. However, before getting into the details of logs and workflows on the application side, can you confirm the following:

- The detailed workflow of how the service was published.
- Whether you can try publishing a test Network Analysis service using the steps in this document: Publish Routing Services utility—ArcGIS Server | Documentation for ArcGIS Enterprise

Hope it helps!
03-02-2025 12:21 AM

POST
Hello @SaurabhUpadhyaya,

Your question covers a broad range of factors. Capacity planning can get quite detailed, and there are many variables at play, so I would suggest starting with a few key questions and refining the setup based on what you observe during testing. Instead of jumping straight into complex planning, start small, gather insights, and scale accordingly.

If it were up to me, I would start with these key questions:

- How much computing power (CPU, RAM) do we need on AWS for ArcGIS Server?
- How many users can work simultaneously without performance issues?
- How much load can PostgreSQL RDS handle efficiently?

To estimate and answer these, we can use the following:

- Esri's Capacity Planning Tool (CPT) for predicting system requirements.
- Since you mentioned AWS, AWS CloudWatch to monitor ArcGIS Server and database performance.
- JMeter for simulating real-world load and identifying bottlenecks.

To size the system properly and choose an AWS instance size, we also need to figure out the following:

1. How many requests does the system handle (the "request volume")? For example, if 50 users interact with the system at once and each user makes 20 requests per session, that is 1,000 requests; assuming roughly one session per user per hour, that is 1,000 transactions per hour (TPH).
2. What are the most resource-heavy operations among your application's functions? For example:
   - Feature service: displays ~2 million points.
   - Search and buffer: these queries hit the database the hardest.
   - Export to PDF: consumes a lot of CPU and memory.

What can help in understanding this:

- ArcGIS Server logs – to see which requests take the longest, which lets us identify slow requests.
- PostgreSQL slow query logs – to find heavy database operations and pinpoint database bottlenecks.
- JMeter load tests – to simulate concurrent users and test response times.
- AWS CloudWatch – to track CPU, memory, and database performance.

You can also set up some optimization strategies beforehand.

ArcGIS Server optimization:

- Feature service performance:
  - Adjust service instances: set min 2 / max 5 instances, but monitor usage, as you may need to tweak these.
  - Enable shared instances: reduces memory use for infrequently queried services.
  - Use filtering: only load relevant data (don't send all 2M points to users). Since your application is based on the ArcGIS JS API, I would recommend filtering; for more, see Introduction to query and filter | Overview | ArcGIS Maps SDK for JavaScript 4.32 | Esri Developer.
- Search and buffer optimization:
  - Index spatial columns in PostgreSQL using GiST indexes for faster queries.
  - Optimize queries: use SQL EXPLAIN ANALYZE to check query efficiency.
- Export to PDF:
  - Run exports on a separate ArcGIS instance to avoid slowing down other operations.
  - Use asynchronous processing to handle large PDF exports.

AWS infrastructure optimization (suggested EC2 instance by concurrent users):

- 50 users: m6i.xlarge (4 vCPU, 16 GB RAM)
- 100 users: m6i.2xlarge (8 vCPU, 32 GB RAM)
- 200+ users: m6i.4xlarge (16 vCPU, 64 GB RAM)

Monitor all traffic closely using AWS CloudWatch to track CPU and memory usage. If CPU is consistently over 80%, increase the EC2 size or add more servers (a small CloudWatch example follows this post). Remember: always start with the small instance type and then move up.

PostgreSQL RDS performance:

- Use PgBouncer – a connection pooler to handle many users efficiently. For more: pgbouncer/README.md at master · pgbouncer/pgbouncer · GitHub
- Use read replicas – separate heavy search queries from the main transactions. For more: Working with read replicas for Amazon RDS for PostgreSQL - Amazon Relational Database Service
- Monitor queries – check slow query logs to fix bottlenecks. For more: https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_LogAccess.Concepts.PostgreSQL.Query_Logging.html#USER_LogAccess.Concepts.PostgreSQL.Query_Logging.using

Testing and scaling (use these as suggestions):

- Baseline testing: deploy a basic setup, then use ArcGIS Server logs and CloudWatch to measure usage.
- Load testing: use JMeter to simulate 50, 100, and 200 users; measure response times for searches, feature loading, and PDF exports. For more details on using JMeter: Performance Testing with Apache JMeter (An Introdu... - Esri Community
- Scaling plan: if CPU is above 80%, increase the EC2 size; if the database is slow, upgrade RDS or add read replicas; if requests queue up, increase ArcGIS Server instances.

These are just baseline recommendations based on standard performance expectations. Actual system behavior depends on many factors, such as data complexity, network latency, and real-world user interaction. I would suggest starting with these guidelines, monitoring system performance, and then adjusting based on real findings. Let me know your thoughts, or if you want to dive into any specific concern.

Hope it helps!
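For the CPU guardrail mentioned above, here is a small boto3 sketch that raises a CloudWatch alarm when average EC2 CPU stays over 80%, so you know when to scale the ArcGIS Server instance. The instance ID, region, and SNS topic ARN are placeholders, and it assumes AWS credentials are already configured for boto3:

import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="arcgis-server-cpu-high",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],
    Statistic="Average",
    Period=300,               # 5-minute samples
    EvaluationPeriods=3,      # sustained for 15 minutes, not a brief spike
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:gis-alerts"],  # placeholder topic
)

Requiring three consecutive five-minute periods avoids scaling on a momentary spike from a single heavy PDF export.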
03-02-2025 12:05 AM

POST
Hello @Brian_McLeer,

Hope you are keeping safe and well. As per my understanding, "Anonymous Access" should not have any impact on the Basemap Gallery unless one of the items in the basemap group is not shared properly. You may have already checked these, but let's cover our bases before reaching out to Esri Support:

- Ensure that the group set in the Basemap Gallery is viewable by Everyone (public) or by all organization members.
- Ensure that each basemap you want to see in the Basemap Gallery is shared with both Everyone and the basemap group (you may have to check each item).

It seems to me that one or more items in the basemap group may not be available to the users who are trying to access it. This is probably best explained by the diagram below (based on what I have worked with and understood from testing).

It would also be great if you could check/confirm the following:

- What version of ArcGIS Enterprise are you working with? (This will help me test it on my end.)
- Are there any errors in the Portal for ArcGIS logs?
- Are there any errors in the browser's developer tools?

Hope it helps!
12-31-2024 12:01 AM

POST
Hello @ESRI_User_121,

This is certainly a challenging situation. I understand that switching browsers across an organization, or upgrading the environment to a newer version, is not a decision that can be taken lightly, and it is unlikely to be an easy task to accomplish in a short period of time.

As an alternative, you could consider leveraging the MutationEventsEnabled property, as @RyanUthoff mentioned, described in the following article: https://support.esri.com/en-us/knowledge-base/impact-of-google-chrome-and-microsoft-edge-127-updates--000032813#:~:text=Google%20Chrome%20and%20Microsoft%20Edge%20both,a%20new%20version%20of%20ArcGIS%20Enterprise

For reference:

- Edge policies: Microsoft Edge Browser Policy Documentation | Microsoft Learn. For testing in Edge, go to edge://flags, search for "mutation", and set "Enable (deprecated) synchronous mutation events" to Enabled.
- Chrome policies: Chrome Enterprise Policy List & Management | Documentation. For testing in Chrome, go to chrome://flags, search for "mutation", and set "Enable (deprecated) synchronous mutation events" to Enabled.

If that works, applying the setting through a group policy would be a simpler approach than switching browsers or upgrading the environment; it would minimize disruption and require no additional effort from end users (see the sketch after this post). I can look into the steps to deploy this as a Group Policy and share them with you, though I would strongly recommend reaching out to your IT team for assistance with this process. That said, if you still need further details or clarification, please let me know and I will do my best to provide the necessary steps.

Hope it helps!
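As a hedged sketch of what such a policy deployment would effectively do on each machine (Windows only, run as administrator): Chrome and Edge read enterprise policies from the registry hives below, so setting the MutationEventsEnabled value there mirrors a Group Policy rollout. Verify the policy name and supported versions against the policy documentation linked above before deploying:

import winreg

POLICY_KEYS = [
    r"SOFTWARE\Policies\Google\Chrome",    # Chrome policy hive (assumed path)
    r"SOFTWARE\Policies\Microsoft\Edge",   # Edge policy hive (assumed path)
]

for subkey in POLICY_KEYS:
    key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, subkey, 0,
                             winreg.KEY_SET_VALUE)
    # 1 re-enables the deprecated synchronous mutation events
    winreg.SetValueEx(key, "MutationEventsEnabled", 0, winreg.REG_DWORD, 1)
    winreg.CloseKey(key)

In practice your IT team would push the same values via an ADMX-based Group Policy or Intune rather than a script, but the end state on each workstation is identical.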
12-29-2024 10:11 PM

POST
Hello @gisarchitect,

I am glad I could clarify the concerns, and I'm happy to take a shot at this one as well. As per my understanding, ArcGIS Server uses connection pooling to manage database connections efficiently. The behavior depends on the service's "Connection Pooling and Instances" settings and also takes the database's own settings into consideration.

For service instance settings:

- Dedicated: each map or feature service can have a minimum and maximum number of instances, which effectively translates into database connections (although I believe connections are reused when many requests hit the service in quick succession).
- Shared: the number of shared instances can be set to optimize resource use across multiple services. It is often recommended to align the number of shared instances with the number of available CPU cores.

In addition, the actual maximum pool size can be influenced by your SQL Server database configuration, where connection strings might define a maximum pool size limit.

Furthermore, I think it is difficult to get this information by querying somewhere directly, although you can at least read a service's instance settings from the Admin API (see the sketch after this post). Measuring the real connection count would require a load testing scenario with tools like JMeter; I will try to come up with some scenarios and share them with you.

Apologies for the delay in response; I was a bit occupied.

Hope it helps!
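Here is a rough sketch, assuming an admin token, that reads a service's instance settings from the ArcGIS Server Admin REST API, since the min/max instance counts are what bound the dedicated-instance connections described above. The server URL, token, and service name are placeholders, and the keys are read defensively because shared-instance (DMaps) services may not report per-node counts:

import requests

SERVER_ADMIN = "https://gisserver.domain.com:6443/arcgis/admin"  # placeholder
TOKEN = "<admin token>"

def instance_settings(service="SampleWorldCities.MapServer", folder=""):
    """Return the provider and min/max instances for a service."""
    path = f"{folder}/{service}" if folder else service
    resp = requests.get(
        f"{SERVER_ADMIN}/services/{path}",
        params={"f": "json", "token": TOKEN},
        verify=False,  # only for self-signed certificates
    )
    resp.raise_for_status()
    info = resp.json()
    return {
        "provider": info.get("provider"),          # e.g. shared vs dedicated
        "min": info.get("minInstancesPerNode"),
        "max": info.get("maxInstancesPerNode"),
    }

print(instance_settings())

Comparing these numbers against the database's own connection limit gives a rough upper bound before any load testing.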
08-06-2024 03:24 AM

POST
Hello @AnkitaGawai,

This is an interesting question, and I'll try to address the key concerns I would have. Based on my understanding, you can upgrade the environment according to your needs, depending on the requirements of the method you choose; however, Esri recommends sticking with the original deployment method. Using ArcGIS Enterprise Builder allows you to upgrade everything at once, whereas switching to a different method would require you to upgrade each component individually, depending on the chosen method. Additionally, once you transition away from the ArcGIS Enterprise Builder deployment process, reverting to that method may not be possible.

It's also important to consider the complications of other methods, such as Chef. Not all deployment methods are suitable for every scenario. ArcGIS Enterprise Builder is ideal for rapid, simplified deployment of a base ArcGIS Enterprise configuration (ArcGIS Server, Portal for ArcGIS, ArcGIS Data Store, and Web Adaptor) on a single machine. In contrast, Chef scripts are typically more beneficial in multi-node setups where configuration consistency and automation are critical. In a single-machine setup, the overhead of scripting an upgrade with Chef may outweigh the benefits.

The above considerations focus on the differences and requirements of choosing a particular method. They do not take into account the manual steps you would need to perform, the compatibility issues you might encounter, the management of downtime during the upgrade, or any custom configurations you need to consider.

I hope this helps you make an informed decision.
08-06-2024 02:53 AM

POST
Hello @DominicReed,

In addition to @jcarlson's suggestions, I would suggest using FeatureSetByPortalItem (for reference) to achieve the filtering functionality you want. You may need to structure your Arcade expression a bit differently to build a dictionary that links the data to the hosted feature service directly. Here's a suggested approach:

- Create a dictionary keyed by specialty: each specialty is a key, and the value is a list of health unit codes associated with that specialty.
- Use the dictionary for filtering: filter the point features based on the selected specialty.

Here's a probable way to do this in Arcade (I have made some assumptions about the dictionary components, which you will need to adjust accordingly):

// Access the portal and feature set
var portal = Portal('https://www.iede.rs.gov.br/portal');
var fs = FeatureSetByPortalItem(portal, "<Item_ID>", <Layer_ID>);

// Initialize a dictionary to store specialties and their corresponding health unit codes
var specialtyDict = {};

// Iterate over the features in the feature set
for (var feature in fs) {
    var specialty = feature['DS_ESPECIALIDADE'];
    var healthUnitCode = feature['CNES_ref'];
    // If this specialty is not in the dictionary yet, initialize an empty array for it
    if (!HasKey(specialtyDict, specialty)) {
        specialtyDict[specialty] = [];
    }
    // Add the health unit code to the array for this specialty
    Push(specialtyDict[specialty], healthUnitCode);
}

// Return the dictionary as JSON text
return Text(specialtyDict);

Description:

- FeatureSet access: access the feature set using FeatureSetByPortalItem with the item ID and layer index.
- Dictionary initialization: create an empty dictionary to store specialties as keys and arrays of health unit codes as values.
- Iterate over features: loop through each feature, extracting the specialty and health unit code.
- Check and add: if the specialty is not yet a key in the dictionary, add it with an empty array; then push the health unit code into that array.
- Return the dictionary: finally, return the dictionary as JSON text, which can be used to filter the map points.

Hope it helps!
08-06-2024 02:19 AM