POST
Xander Bakker, thank you. I am much further ahead; the only part the script appears to be failing on now is a "Failed edit operation. Required keyword not defined in script" error. I wonder if it is the 'labeltext' element, which is actually the field that the result needs to be written to.

var tbl = FeatureSetByName($datastore, "DB.LAND.NonCustomerAddress");
var fc = FeatureSetByName($datastore, "DB.LAND.Premise");
var premiseKey = $feature["GlobalID"];
var nonCustomerSql = "GlobalID = '" + premiseKey + "'";
var nonCustomerAddressResult = Filter(tbl, nonCustomerSql);
var txt = null;
for (var address in nonCustomerAddressResult) {
    txt = txt + address.building_number + ',';
    premiseKey = address.premiseguid;
}
var premiseSQL = "GlobalID = '" + premiseKey + "'";
var premiseResult = Filter(fc, premiseSQL);
return {
    "result": txt,
    "edit": [{
        "className": "DB.LAND.Premise",
        "updates": [{
            "labeltext": txt
        }]
    }]
}
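A guess at the cause, based on the error text rather than anything confirmed in the thread: in an attribute rule's edit dictionary, each entry under "updates" normally needs an "objectID" or "globalID" keyword identifying the row to edit, with the field values nested under an "attributes" dictionary. A minimal sketch of that shape, reusing the names from the snippet above (it assumes premiseKey holds the target GlobalID and txt holds the rolled-up text):

// Sketch only - assumes premiseKey identifies the DB.LAND.Premise row to
// update and txt holds the comma-separated building numbers.
return {
    "result": txt,
    "edit": [{
        "className": "DB.LAND.Premise",
        "updates": [{
            "globalID": premiseKey,      // row identifier keyword
            "attributes": {              // field values go inside "attributes"
                "labeltext": txt
            }
        }]
    }]
}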
07-23-2020 05:14 PM

POST
Xander, thank you, I hadn't even noticed the logic error. I've updated the code here for the popup. Would I still use $map in the attribute rule? I wouldn't think so, but I could be wrong. Also, the $feature here is referencing the point I clicked; however, in the attribute rule it's actually reversed, as it would go from the table to the feature. As for the SQL, I am using it as I need to roll up all of the values from the table and then write them to the feature in a comma-separated list.

var tbl = FeatureSetByName($map, "Non Customer Address");
var fc = FeatureSetByName($map, "Premise");
var premiseKey = $feature["GlobalID"];
var nonCustomerSql = "premiseguid = '" + premiseKey + "'";
var premiseSQL = "GlobalID = '" + premiseKey + "'";
var nonCustomerAddressResult = Filter(tbl, nonCustomerSql);
var txt = null;
var premiseResult = Filter(fc, premiseSQL);
for (var address in nonCustomerAddressResult) {
    txt = txt + Text(address.building_number, ',');
}
If (count(txt) > 0) {
    premiseResult = Filter(fc, premiseSQL);
    premiseKey.labeltext = txt;
}
return {
    "result": txt,
}
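For the roll-up itself, a minimal popup-expression sketch (not from the thread; the layer and field names are taken from the snippet above and may need adjusting). It accumulates into an empty string rather than null, and only returns the text, since a popup expression cannot write the value back to the feature; the write-back belongs in the attribute rule.

// Sketch only - assumes a layer named "Non Customer Address" with fields
// premiseguid and building_number, related to the clicked Premise feature.
var tbl = FeatureSetByName($map, "Non Customer Address");
var premiseKey = $feature["GlobalID"];
var related = Filter(tbl, "premiseguid = '" + premiseKey + "'");
var txt = "";
for (var address in related) {
    if (txt != "") {
        txt = txt + ",";
    }
    txt = txt + Text(address.building_number);
}
return txt;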
07-23-2020 04:30 PM

POST
All, I am trying to implement an attribute rule which updates a feature class field when the related 1:M table gets updated. The process right now is: the field in the related table changes, we get the keys, roll up the data into a string, then get the related record in the feature class and write the text value. The issue is, a) the for loop is in error, and b) if I comment it out and put a dummy value in my text variable, it doesn't actually write anything. Any help would be appreciated - thank you.

var tbl = FeatureSetByName($datastore, "DB.LAND.CustomerAddress");
var fc = FeatureSetByName($datastore, "DB.LAND.AddressLocation");
var locationeguidKey = $feature["locationeguid"];
var CustomerSql = "locationeguidKey = '" + locationeguidKey + "'";
var premiseSQL = "GlobalID = '" + locationeguidKey + "'";
var CustomerAddressResult = Filter(tbl, CustomerSql);
var txt = null;
var address;
for (var address in nonCustomerAddressResult) {
    txt = Text(address.building_number, ',');
}
return {
    if (Count(txt) > 0) {
        txt = Left(txt, Count(txt) - 1);
        premiseResult = Filter(fc, premiseSQL)
        // premiseKey.labeltext = txt;
    }
    "result": txt,
    "edit": [{
        "className": "premiseResult",
        "updates": [{
            "labeltext": txt
        }]
    }]
}
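A minimal sketch of the roll-up loop and trailing-comma handling described above (my assumption, not the thread's final answer; in particular it assumes the related table's key field is actually named locationeguid, and that "className" should be the literal class name rather than a variable):

// Sketch only - field and class names assumed from the snippet above.
var tbl = FeatureSetByName($datastore, "DB.LAND.CustomerAddress");
var locationKey = $feature["locationeguid"];
var related = Filter(tbl, "locationeguid = '" + locationKey + "'");
var txt = "";
for (var address in related) {
    txt = txt + Text(address.building_number) + ",";   // accumulate rather than overwrite
}
if (Count(txt) > 0) {
    txt = Left(txt, Count(txt) - 1);                    // trim the trailing comma
}
return {
    "result": txt
    // the "edit" array would then name the literal target class
    // ("DB.LAND.AddressLocation") with a globalID keyword and an
    // "attributes" dictionary, as in the sketch under the newest post above
};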
07-23-2020 01:48 PM

POST
Rich, thanks - we're inserting into Default; it's a table that connects to a feature class through a simple relationship class. I look forward to testing 2.6. It's interesting, as Pro's Append command will do the same in about 30 minutes (all 1.3-odd million records).
07-20-2020 06:03 AM

POST
All, does anyone know if there is a more efficient mechanism to bulk load records into a branch versioned geodatabase table? This is a console app, so I am using the ApplyEdits method (ArcGIS Pro 2.5 API Reference Guide), reading and writing in blocks of 2000 records. The target is to load approximately 1.5 million records. In regular C# code (data transformation, etc.) with SQL commands this takes about 4 minutes; it's taking about 12 minutes per 2000 records with the ApplyEdits method in the core API, reading 2000-record batches from the source and writing the same. The table has a relationship class tied to a Utility Network feature class, so it needs to be, and remain, branch versioned. Any advice would be greatly appreciated.
07-17-2020 08:07 AM

POST
Absolutely agree - from my understanding it is actually the selection of the underlying PostgreSQL technology, and how it's implemented for the Data Store, that causes the primary / secondary scenario. I used HA in the sense that, when set up, the system should always "be in sync" and the primary will fail over to the secondary smoothly; it's when you try to recover too quickly that you have the issue with the two not "lining up", as it were. It's also why you can only have two Portal machines. It would have been good had Portal followed the same HA strategy as ArcGIS Server, but alas, we have what we have.
12-16-2019 05:54 AM

POST
We found the same. We scheduled the patch process to shut down the standby, then shut down and patch the primary first, and then patch and restart the standby system. Unfortunately, this takes you out of HA for a short period of time, but it's worth it to avoid all the index problems that otherwise result. We're hoping that this issue has been addressed in 10.7.
12-13-2019 05:38 AM

POST
Hi, did you ever get this to work? No matter what I do it fails. I built the app from Web AppBuilder on ArcGIS Online and downloaded it to host on the internal network, and I am always prompted for the username / password - even if I supply the Client_ID and Client_Secret with the token service, OAuth2, or neither used. Interestingly enough, I can log into ArcGIS Online, but when prompted through the web app my login fails. I had to add the following to the bottom of the config file, as it wasn't available in the base app when I downloaded it:

"httpProxy": {
    "useProxy": true,
    "alwaysUseProxy": false,
    "url": "https://xxx.xxx.xxx.xxx/proxy/proxy.ashx",
    "rules": []
},
10-04-2019 11:23 AM

POST
Anthony, XRay is an Esri add-in that will output a Geodatabase data model into easily usable / reportable Excel files. I'm hoping to get the Excel files for the particular data models.
05-16-2019 03:01 PM

POST
Are you looking for tools? Or the kinds / types of work activities to conduct a project like this?
05-16-2019 09:58 AM

POST
All, just wondering if anyone has the X-Ray output for the Esri utility models (Desktop / Geodatabase), not the newer utility configurations? This would be for Water Utilities, Gas, Electric, Telecommunication, and Local Government. If you have the X-Ray output and don't mind posting it, it would be greatly appreciated. Thanks,
05-16-2019 09:57 AM

POST
David, sorry about the tardy reply on my behalf. We never did determine the best way to do this (well, we did: use the survey123.arcgis.com report tool, but that had significant problems with the system and the reporting is very limited through the Word templates). We also ended up using a publicly exposed system in order to make this work; the main system was behind a firewall and wouldn't communicate with the external Survey123 site. I think the best bet would be to use an FME / Data Interoperability job to bring the data from the Data Store into a "proper" relational database system for reporting purposes. The black box that is the Data Store has certain benefits, but this unfortunately isn't one of them. Kieren
04-23-2019 11:59 AM

POST
Fraser, Our ports are open, but indexes are out of sync. Did you shut down the entire server or just the service? We have shut down the service and it didn't correct the issue. Running portal 10.5.1 Thanks, Kieren
06-04-2018 11:02 AM

POST
Hi, we have surveys hosted on ArcGIS Online. We have republished these surveys into our in-house ArcGIS Portal environment (10.5.1) leveraging the Data Store. We now need to load the exported ArcGIS Online data into Portal. The obvious tool to do this is the Append tool; however, the Maintain Attachments check box does not in fact maintain the attachments (the feature class loads, but the attachments do not). The Preserve Global IDs option also fails with a -9999 error, as it only appears to work with enterprise geodatabases. If we could preserve the GlobalIDs then we could simply load the attachment tables. Is there an easy way to do this that I am missing? It would seem there should be; if not, I can add a bunch of fields and attempt to recalculate GlobalIDs after the fact, but this seems a little over the top. Thanks all
05-29-2018 08:27 AM

POST
Andi, we have a few surveys that use a feature service connected to our enterprise geodatabase (Oracle in our case), and this works well. Our environment is quite dynamic, with different surveys being constructed regularly, and the users just publish them to the Data Store. survey123.arcgis.com has a nifty "Report" tool; the issue is that it is in beta and does not play well with ArcGIS Enterprise, though it works quite nicely with ArcGIS Online. It is this type of easy reporting, i.e. click on a record (or select it somehow) and then generate a quick report, that we are looking to replicate. We are hoping that come July (the next scheduled release of survey123.arcgis.com) this will come out of beta and work nicely. Until then, we are looking for solutions 🙂 which, if good enough, may become permanent. At the moment I am considering a Data Interop type job where the user will submit an ID and the system will extract the data and produce a report. I'll post what I figure out. Kieren
04-19-2018 04:55 PM