POST
In case anyone else runs into a similar problem and comes across this post, we found a way to get the calculation to work without changing the JS function. To keep the pulldata calculation from throwing an error before the input fields have been filled out, we made the calculated field relevant only when the last field used in the calculation has a value, using the following expression in the 'relevant' column of the XLSForm: string-length(${dms_string})!=0 With this change, the pulldata calculation doesn't run until the necessary fields are filled out, which prevents the error from appearing and allowed us to publish the survey!
Posted 05-09-2022 09:36 AM

POST
Thanks for the reply @DougBrowning! Unfortunately, I had already updated that setting for the survey. Because we are using a Javascript function to convert a list of DMS coordinates like 36-56.00N 023-29.00E;36-42.00N 023-32.00E;36-41.00N 023-48.00E;36-47.00N 023-55.00E into a list of DD coordinates like 36.93333 23.48333;36.70000 23.53333;36.68333 23.80000;36.78333 23.91667, that setting doesn't resolve the error returned by the pulldata calculation.
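For anyone following along, here is a minimal sketch of the kind of per-coordinate conversion involved. This is an illustration I'm adding here, not the attached convertCoords.js, and it assumes the degrees/decimal-minutes format shown above, with a hyphen separator and the hemisphere letter last:

    // Convert one DMS token such as "36-56.00N" or "023-29.00E" to decimal degrees.
    function dmsToDd(token) {
      var hemisphere = token.slice(-1);            // "N", "S", "E", or "W"
      var parts = token.slice(0, -1).split("-");   // e.g. ["36", "56.00"]
      var dd = parseFloat(parts[0]) + parseFloat(parts[1]) / 60;
      return (hemisphere === "S" || hemisphere === "W") ? -dd : dd;
    }

    // dmsToDd("36-56.00N")  -> 36.9333...
    // dmsToDd("023-29.00E") -> 23.4833...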
Posted 05-02-2022 05:33 AM

POST
We have written a custom Javascript function to parse lists of coordinates defining a polygon from several websites. The function parses out the parts of DMS coordinates, converts them to DD coordinates, and re-concatenates them into a semicolon-separated list of space-separated coordinate pairs, which we would like to use as the calculation for a geoshape field. The function has been tested both in a browser console and in the Scripts tab of Survey123 Connect, and it returns the expected list with no errors.

However, when the pulldata call pulldata("@javascript", "convertCoords.js", "convertCoordString", ${dms_string}, ${country}) is put into the calculation cell for the polygon field, it returns an error by default: @javascript error:TypeError: Cannot read property 'split' of undefined in convertCoords.js:convertCoordString Once you enter a value into both of the fields used in the function and hit 'enter', the function calculates the expected string. But it seems that this error causes an XLSX error that prevents the survey from being saved when the resulting field is placed in the calculation cell for the geoshape question.

I've attached the Javascript function; samples of each DMS coordinate string (${dms_string}) format and the corresponding ${country} values are included in the comments at the top of the script file. Is there something that can be changed in the script so that it doesn't calculate an error by default in Survey123 and the geoshape question calculates correctly?
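One possible way to keep the expression from erroring before the inputs exist - this is a sketch of a guard that could be added, not the current contents of convertCoords.js, and the parameter names are assumptions:

    function convertCoordString(dmsString, country) {
      // Guard: if either input is still empty/undefined when the form first loads,
      // return an empty string instead of letting a later .split() call fail.
      if (!dmsString || !country) {
        return "";
      }
      // ...the existing parsing and DMS-to-DD conversion logic would follow here,
      // e.g. dmsString.split(";") over each coordinate pair...
      return dmsString;  // placeholder so the sketch runs on its own
    }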
Posted 04-29-2022 10:42 AM

POST
I did not notice that while scouring the page - thanks @KimberlyGarbade! I will update that and try the valueOf function instead today!
Posted 03-10-2022 04:22 AM

POST
We have a service that is updated via a Survey123 form with attachments enabled. For security, the original Hosted Feature Layer that is updated by the survey has its editing settings restricted. There are two corresponding view layers: one with full editing enabled for internal data cleaning, and one with editing disabled for display in a map. Users report that they are adding attachments to the points they submit through Survey123, but I cannot see the attachments in either the hosted feature layer or the corresponding view layers, despite the original layer saying attachments are enabled and the 'hide attachments' option being visible on the view layers. I discovered this when trying to add the attachment images to pop-ups using this workflow, https://community.esri.com/t5/arcgis-online-blog/show-attachments-in-pop-ups-with-arcade/ba-p/890588, where the Attachments function in the Arcade window said it was "not available". Has anyone else encountered this problem before and/or know how we might fix it? I'm hoping I just missed something easy! (We're running ArcGIS Enterprise 10.7.1 on Linux)
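One quick check worth sketching here (this is an assumption on my part, not part of the original troubleshooting, and the URLs are placeholders): compare the hasAttachments flag reported by the REST endpoint of the source layer and each view layer, to see where attachments stop being reported.

    // Minimal sketch (Node 18+ global fetch; add a token parameter if the services
    // are secured): print hasAttachments for the source layer and both view layers.
    const layerUrls = [
      "https://example.com/arcgis/rest/services/Hosted/Source/FeatureServer/0",
      "https://example.com/arcgis/rest/services/Hosted/EditView/FeatureServer/0",
      "https://example.com/arcgis/rest/services/Hosted/DisplayView/FeatureServer/0",
    ];

    (async () => {
      for (const url of layerUrls) {
        const info = await fetch(url + "?f=json").then((r) => r.json());
        console.log(url, "hasAttachments:", info.hasAttachments);
      }
    })();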
Posted 03-09-2022 11:54 AM

POST
We are trying to use a Field Calculator Processor to create a new ID field that concatenates a simple 2-character text string with a string version of the ObjectID field. The idea is that this new ID field would look something like XX125, where 'XX' is the string and '125' is the ObjectID value. To achieve this in ArcGIS Pro, we used the Calculate Field tool with this Python 3 expression: 'XX'+str(!objectid!)

The goal is to have the Field Calculator processor generate this value for new submissions to a dataset as they pass through GeoEvent, which is calculating some other fields as well. Here are the expressions we've tried in the Field Calculator processor (in GeoEvent the objectid field is interpreted as an integer):

'XX'+toString(objectid)
'XX'+objectid
concat('XX', toString(objectid))
concat('XX', objectid)

We've also tried setting up Field Mapper processors on either side of the Field Calculator processor to change the ObjectID field from an integer to a string type, then tried both of these expressions:

'XX'+objectid
concat('XX', objectid)

It is unclear to me why neither of these methods (nor any of the expressions listed) is working. Has anyone else had similar trouble or know what might be wrong? (We're running ArcGIS Enterprise (including GeoEvent Server) 10.7.1 on Linux)
Posted 03-09-2022 11:31 AM

POST
We are looking for ways to share some of our hosted feature layers with another non-Esri GIS system. We know it can ingest WMS/WFS OGC services and that we can secure these services and authenticate using tokens. But we would like to look into a more secure option for these services; is there any way, other than enabling LDAP and PKI (https://enterprise.arcgis.com/en/portal/10.7/administer/linux/use-ldap-and-pki-to-secure-access-to-your-portal.htm), to secure services with certs or keys? Would it be possible to keep using built-in Portal accounts as the primary form of authentication and enable LDAP/PKI for a few use cases? Has anyone else tried anything like this before? We are running ArcGIS Enterprise 10.7.1 on Linux.
Posted 01-18-2022 11:55 AM

POST
Have you double-checked that the layer is editable? I've come across a similar error for hosted feature layers I forgot to make editable. I've also found that sometimes you have to add fields through the Portal item details page instead of Pro (at least for hosted feature layers), unfortunately. This error page might help you pinpoint the problem too - https://pro.arcgis.com/en/pro-app/latest/tool-reference/tool-errors-and-warnings/001001-010000/tool-errors-and-warnings-00851-00875-000852.htm
Posted 11-16-2021 05:40 AM

POST
It was recently brought to our attention that the default date options in a date Filter widget aren't automatically applying. To elaborate, we have many maps with a Filter widget added. To prevent confusion for non-GIS-savvy users, we generally set these filters to apply automatically, so that when users change any of the available filter options, the filter is applied right away. Most of these filters have several possible queries based on several fields that ask users for values before filtering. For date fields, we usually use the "between" option, and we noticed recently that the default options of "Yesterday", "Today", and "Tomorrow" don't seem to apply automatically. For example, if you set the range to 10/1/21 to "Today", nothing happens, but if you change the range to 10/1/21 to 10/7/21, the data filters automatically. We also noticed that if you turn the whole filter off and back on, the "Yesterday", "Today", and "Tomorrow" options do apply; the problem only occurs when the filter is already on before the date range is set. Has anyone else seen behavior like this? Is this expected behavior? Is there anything we can do to fix it? We are running ArcGIS Enterprise 10.7.1 on Linux.
Posted 10-07-2021 11:14 AM

POST
Hi @DanWade - thanks for your response! What you mention about that field being an array makes a lot of sense. Working from the logic that this field is an array, I allowed GeoEvent to re-create the definition, but it continued to read that field as a string. I then tried a couple of things to get the array to be read as one:

1. I reordered the features in the geojson to put one with multiple items in the lbl_list array first, so it looks more like "lbl_list": [ "label 1","label 2" ], and not "lbl_list": [ null ], However, it still seems to be interpreted as a string.

2. My next test was to manually change the field in the definition to a group attribute and add five string "values" fields beneath it (a quick scan of the geojson in question showed five to be the longest label list I could find). After doing this, I adjusted the GeoEvent Service to map all of those values out as "lbl_list.value1" (and so on). I then tried two things here: having those values written directly out into separate fields, and adding a Field Calculator that should aggregate those fields into one string field. Neither seems to have worked.

I opened an Esri Support Case in case this is unexpected behavior, but I am happy to consider any other insights you might have to help with troubleshooting.
Posted 08-20-2021 04:42 AM

POST
@JonEmch I did what you suggested and started an Esri Support case for this problem earlier today and DM'd you the case number. Thanks for reaching out!
Posted 08-16-2021 11:24 AM

POST
We have a workflow set up to read in new records to a feature layer on our Portal and send an email notification to a group of users monitoring that dataset. We are running ArcGIS Enterprise 10.7.1 on all components: Server, Portal, and GeoEvent Server. Last week, I attempted to update this process to send emails to different groups of email addresses based on an attribute in the dataset. However, I kept running into a problem: no matter where I moved the filter in the GeoEvent service processing, it proved to be a stopping point for the processing. The desired workflow routes incoming records through a filter on that attribute and on to a different email output for each group. Based on the documentation, https://enterprise.arcgis.com/en/geoevent/latest/analyze/attribute-filters.htm, what we have set up should work, but since the documentation doesn't go back to 10.7, I'm wondering if there might be something I'm missing. I tried many combinations, but always ended up with the filter acting as a block:

- With the field as a string field, I tried the filter with quotes and without quotes around the text, as well as using both the field value and the alias value.
- With the field as an integer field, I used the integer field values.
- I also tried both "=" and "MATCHES", with the value expressed as a regular expression.

We have a filter in another GeoEvent service that uses a regular expression and seems to work as expected, and I'm stuck as to why we might be having this problem with this particular feature service. Has anyone else experienced similar problems or know what might be wrong? Thanks in advance! We have currently found a work-around for this problem using Query Definitions on 3 separate inputs, but would like to keep things cleaner with just one input and filters if possible.
Posted 08-16-2021 11:20 AM

POST
We are currently using GeoEvent to regularly pull in a .geojson file from an external source. In this GeoJSON, there is a field that contains several strings enclosed in square brackets; it looks like this: "lbl_list": [ "label 1","label 2" ], Previously, the contents of the square brackets were written to a string field in the output feature layer, but since we finalized our processing workflow and published a new feature layer from GeoEvent Server (using the dialog box in the Output Connector), that field remains empty. I have double-checked the field length, and as most features don't contain more than one or two labels, the length of 1024 seems more than adequate to hold all the characters. I also checked all our Field Mapper processors to make sure that the field is mapped as expected, and confirmed that it is. Has anyone run into a similar problem and found a work-around? Or know what might be wrong in our workflow? We are currently running GeoEvent Server 10.7.1 on Linux.
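For comparison while troubleshooting, here is a minimal sketch (an assumption on my part, not something in the GeoEvent workflow above; file names are placeholders) of flattening the lbl_list array into a single comma-separated string in a pre-processing pass over the GeoJSON, which is the value we ultimately want in the output string field:

    // Node.js: join each feature's lbl_list array into one comma-separated string
    // property, so the value arrives at GeoEvent as plain text rather than an array.
    const fs = require("fs");

    const geojson = JSON.parse(fs.readFileSync("input.geojson", "utf8"));
    for (const feature of geojson.features) {
      const labels = feature.properties.lbl_list;
      feature.properties.lbl_list = Array.isArray(labels)
        ? labels.filter(Boolean).join(",")   // drop nulls, e.g. "label 1,label 2"
        : "";
    }
    fs.writeFileSync("flattened.geojson", JSON.stringify(geojson));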
Posted 08-11-2021 06:03 AM

POST
@RichSiko Thanks for your insight! That's essentially what I had to do - republish the layer and recreate the view. Your post made me remember that I had a similar problem with another layer a few months back, and then I remembered that both of those feature layers had been published from Pro. That seems to be the problem in my case: publishing a hosted feature layer from ArcGIS Pro to Portal. In this case, I used the existing feature layer as a template to create a new Hosted Feature Layer in Portal, and then appended the data in the first layer to the second in ArcGIS Pro. Once I did this, all views created from it worked as expected!
Posted 06-17-2021 08:01 AM

POST
For those who may come upon this post, here is how we finally resolved this problem. When downloaded from the source, the GeoJSON in question is about 28 million characters on one line. We learned through a support ticket with Esri that GeoEvent does not process one-line files well, and that there is an existing bug about GeoEvent having trouble reading exceptionally large files (in terms of characters) - BUG-000124967. With these two things in mind, as well as the help received from @EricIronside above (thanks again!), we followed these steps to pre-process the GeoJSON before letting GeoEvent read it (a sketch of the first step appears at the end of this post):

1. Process the file to place each feature on its own line, bringing the total number of lines to around 2,000. We also considered breaking the file into even more lines, but ended up not needing to.
2. Make sure all polygons follow the right-hand rule (originally, some did, but not all).
3. Use mapshaper to simplify the polygons to 30% of their previous vertices (this reduces the number of characters significantly; one extremely detailed feature goes from 219,000 vertices to about 70,000). A note: for our use, simplifying the polygons to this degree does not impact the usefulness of the data. You may want to consider different simplification parameters if the accuracy of your data needs to be high.

I also changed the number of lines per batch in the "Watch a Folder for new GeoJSON Files" input connector from the default to 2 (due to the length of each line), with a wait time of 500 milliseconds (the default), which slowly and methodically reads in all approx. 1700 features, 4 per second. Since we are updating this data daily, this longer processing time works for us, but this process may cause problems if you need to update the data on a much shorter timescale.
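Here is a minimal sketch of the first pre-processing step (one feature per line). It assumes a standard GeoJSON FeatureCollection; the file names are placeholders and this is not the exact script we used:

    // Node.js: read the single-line GeoJSON and rewrite it with one feature per line.
    // Only the type and features members are kept in this sketch, so the output is
    // still valid JSON overall.
    const fs = require("fs");

    const collection = JSON.parse(fs.readFileSync("source.geojson", "utf8"));
    const lines = [];
    lines.push('{"type":"FeatureCollection","features":[');
    collection.features.forEach((feature, i) => {
      const suffix = i < collection.features.length - 1 ? "," : "";
      lines.push(JSON.stringify(feature) + suffix);   // one feature per line
    });
    lines.push("]}");
    fs.writeFileSync("per-line.geojson", lines.join("\n"));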
Posted 06-17-2021 07:47 AM