POST
Thanks, that worked in FME as well. I caught up the missing days by adding a WHERE clause of created_date >= '2020-07-02 00:00:00' on each of the readers. I now have the readers with a WHERE clause of created_date >= CURRENT_DATE() and run it each day to grab the tracks. I understand it was a known limitation, but why replace a working product with one that has such a big limitation, and only fix it a year later? I didn't expect it to be an issue because AGOL only keeps 30 days' worth of tracks, so I thought that was the limiting factor; as long as we processed within 30 days, we should have been fine. It would be nice to wait until the new product works fully, so people who rely on the old product aren't left trying to figure out what no longer works and finding workarounds to patch up processes. Or at least allow them to run in parallel until the bugs are worked out of the new one. I appreciate the innovation, but it is hard to constantly have to discover how things no longer work or don't work as expected. Thank you for your quick reply and solution.
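For anyone reproducing the same daily filter in a script rather than an FME reader, here is a minimal sketch. The helper name is hypothetical, it assumes the track layer's timestamp field is called created_date, and the TIMESTAMP literal syntax may need adjusting for your database:

```python
from datetime import date

def daily_where(field="created_date"):
    """Build a WHERE clause selecting only rows created today or later,
    mirroring the created_date >= CURRENT_DATE() filter used in the readers."""
    return "{} >= TIMESTAMP '{} 00:00:00'".format(field, date.today().isoformat())

# Example: pass the clause to whatever query mechanism you use
print(daily_where())  # e.g. created_date >= TIMESTAMP '2020-07-08 00:00:00'
```

Rebuilding the clause from today's date each run avoids relying on the database evaluating CURRENT_DATE() the same way everywhere.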
07-08-2020 08:15 AM

POST
How are we supposed to extract the tracking layer if you only allow the first 28800 records? We used the tracking layer in Collector for a few years without issue. We were then forced to pay for Tracker, and now we can only extract the first 28800 records? This needs to be fixed right away. We need to read the features into our geodatabase to meet legislative requirements for road and sidewalk patrols, and we use FME to extract the features each day. It was working fine until we hit the magic 28800-feature limit. Stop making half-baked products; your customers rely on actually using them for more than just product demos.
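Until the limit is lifted, one common workaround is paging the query in chunks with the REST parameters resultOffset and resultRecordCount (assuming the service reports supportsPagination). A minimal, service-agnostic sketch — `fetch` here is a stand-in for whatever REST or FME call actually runs the query:

```python
def query_all(fetch, page_size=1000):
    """Collect every feature by paging with resultOffset, instead of stopping
    at the server's maxRecordCount (e.g. the 28800-record ceiling)."""
    features = []
    offset = 0
    while True:
        # e.g. query?...&resultOffset=<offset>&resultRecordCount=<page_size>
        batch = fetch(offset, page_size)
        features.extend(batch)
        if len(batch) < page_size:  # a short page means we've reached the end
            return features
        offset += page_size

# Example with a fake backend holding 2500 records
records = list(range(2500))
fake_fetch = lambda offset, count: records[offset:offset + count]
print(len(query_all(fake_fetch)))  # 2500
```

In FME the same idea can sometimes be approximated by splitting the reader into several date-bounded WHERE clauses so each one stays under the record cap.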
07-08-2020 06:17 AM

IDEA
It would be nice if AGOL administrators could set a permanent password for users that never expires. Managing field-crew password resets is very difficult with users who are not familiar with computers or phones and who may not even have email access. Basically, we have to create users with our own email and an initial password, sign in as each user, and then reset the password to a new standard password. We tried the proper route of each person setting a unique password themselves, but after spending at least a half-hour walking them through setting up their initial passwords, followed by them forgetting those passwords when the device decided to log them out and having to go through the entire cycle again, we decided that front-loading standard passwords for each group is the only way to manage it effectively. It would be great if, when a user is first created, there were a "password never expires" option so we wouldn't have to go through this hack.
03-20-2020 07:12 AM

POST
It is also happening with Oracle, but not all the time. We have 122 Open Data layers that we export daily through a Pro/Python script run from a Windows batch file. All but a handful of them work, and until a week ago all of them worked. We have run into this before but could manually overwrite the web service from ArcGIS Pro; now we get the fairly useless ERROR 999999 message even when manually trying to update the hosted features. The other 118 still work, both manually and through the script. The log gives no extra information. We are using: Windows Server 2012 R2, ArcGIS Pro 2.5, Oracle 12c, ArcGIS 10.7 geodatabase. We have tried all the suggestions from every post I could find (see below). Most suggestions also assume it hasn't been working for months before suddenly getting this error.

- Works fine on roughly 118 of 122 hosted services that are updated through the same method/script
- Has worked for months
- All ArcGIS Pro documents are in the same folder location on the local machine
- Data all from the same geodatabase, same DB owner; connected as the database owner
- Same AGOL user, hosted folder, share settings, etc.
- Tried publishing to a new hosted layer: no luck
- Pro is pointing to local directories for its processing
- Rebooted the machine
- Lots of room on the local computer
- Tried copying layers to a new map: no luck
- Deleted temp files
- File path is not too long; working ones have longer names, plus it worked previously
- No analyze errors
- Checked geometry and symbology
- No new domains, no relationship classes
- Feature class size is fine; Display field in Pro is valid
- Using the default Python environment in Pro
- Used Fiddler while trying to publish; nothing pops out
- SDDraft is created in: C:\Users\<username>\AppData\Local\ESRI\ArcGISPro\Staging\My Hosted Services (NAME)
- Possible problem with creating the index (no log.txt in Maps > index > LAYER folder); rebuilt indexes, no change
- publishingResults.json is missing from C:\Users\<username>\AppData\Local\ESRI\ArcGISPro\Staging\SharingProcesses\<Job ID>\ — is it actually failing on Consolidate Data?
- Staging is set to E:\ArcGISPro\Staging\SharingProcess, but it is putting files in C:\Users\adminsde\AppData\Local\ESRI\ArcGISPro\Staging\SharingProcesses

It would be nice if Esri would add all of these checks, and whatever else is making it fail, to the Analyze tool so we can understand what the issue is, or at least give better error messages when things don't work. Any other suggestions of what to try are welcome.
03-12-2020 11:23 AM

IDEA
It would be nice to be able to have the default extent in Dashboards be dynamic, based on a feature layer. That way the initial extent of the map would be focused on the current features you want to highlight in the dashboard. For example, in a dashboard for road closures, instead of using the extent of the map used in the dashboard, you could select a layer in the map settings within the dashboard, and it would zoom to the current road closures. There could be a new tab for map extent that lets you select the layer and a zoom percentage. That way, as the data changes, the initial extent of the map would reflect those changes.
02-21-2020 07:21 AM

IDEA
I totally agree. We were using tracking layers for several apps, only to have to apologize to staff that they no longer work. Worse still, what they are supposed to be replaced with doesn't really exist outside of demos at conferences. On top of that, we are on an ELA, so we have to purchase Tracker until the end of our ELA and then negotiate to have it included in the renewal. It is great that Esri has ELAs, but what is the point if they randomly remove functionality and then make you pay for a re-imagined version of what we just lost? I would be happy to pay for something new that I need, but in this case what was offered worked for us, and now we are being asked to pay for functionality we don't need and complexity we don't want (if we use the enterprise version: Portal, Data Store, Track Viewer). We had a simple workflow: tracks are created as a tracking layer in AGOL daily and then imported each night into our geodatabase; yearly, tracks are removed from AGOL. Very simple, and users only need to use one app. Is there any documentation comparing the functionality, restrictions, costs, and data model of Tracker for AGOL vs. Tracker for Portal? We have several questions about how to export the tracks to our current geodatabase layer and how easy it is to periodically truncate the tracks from AGOL or Portal. The way the tracking layer was removed and Tracker was released was very poorly executed.
01-15-2020 11:22 AM

IDEA
Currently you can set Allow NULL Values at the feature class and feature table level, but it only works for non-versioned data. We would like the option to set it in a map document so that it is easily enforced at the service level in applications such as Collector. The field details show what the database has, but if a second option were added, like the one for Read Only at the map level, it would allow much greater flexibility without having to edit the service after publishing, if that is even possible. So far it seems you can only change the value on hosted services, but we mostly edit against the geodatabase.
01-09-2020 08:43 AM

IDEA
Charging extra for Tracker is total BS. We are long-time Esri users on an ELA. We were using tracking in Collector just fine up to the new version, which doesn't support it anymore. Tracker has not automatically been included in our ELA, so we have to wait to negotiate it the next time we renew, or purchase it outside the ELA? What is the point of an ELA when working functionality is replaced with something that is not included? The same thing happened with ArcGIS Monitor, except its pricing is hugely out of whack. It's a great product, but most of us GIS types just want to monitor our services with it, not use it as the network monitoring solution it is priced as. Esri has increased the complexity of the servers with Portal and has taken away the piece to monitor it with. So frustrating.
11-28-2019 07:11 AM

IDEA
I would like to be able to search for content that is not created by people in our organization. I want to see what others are doing with our open data or other data sources in our area. With the current search, I have to scroll through everything we have created and pick through the list to find what I am looking for.
10-10-2019 08:57 AM

IDEA
Esri provides great solutions that you can easily deploy, but they have not included any feature-level documentation, XML workspace documents, or file geodatabases to go along with the solutions. Why would you need this for an online solution? We need to give documentation to other users so they can decide whether the fields are what they need or whether we need to add more or change them to suit our organization. Oftentimes domain values also need to be tweaked to suit our organization. We also need a way to compare the solutions over time. The blogs are a great starting point, but we need to see what the schema changes are so we can evaluate whether to use the new feature layer or keep the one we were using. Currently, to do this we need to export each layer as a file geodatabase and then download it from AGOL. We then have to export the FGDB to an SDE geodatabase so we can produce a report with field information for the planning group to decide if they want changes. Providing feature-level documentation, XML workspace documents, or file geodatabases along with the solutions would save a lot of time.

Howard Crothers
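Once the two schemas are in hand (for example, from each layer's fields list in the service or item JSON — how you fetch them is up to you), the comparison step itself is easy to script. A sketch of a diff helper, with hypothetical sample schemas:

```python
def schema_diff(old_fields, new_fields):
    """Compare two {field_name: field_type} mappings and report what a
    new solution release added, removed, or retyped."""
    old, new = set(old_fields), set(new_fields)
    return {
        "added": sorted(new - old),
        "removed": sorted(old - new),
        "retyped": sorted(f for f in old & new if old_fields[f] != new_fields[f]),
    }

# Hypothetical example schemas from two releases of the same solution layer
v1 = {"OBJECTID": "esriFieldTypeOID", "status": "esriFieldTypeString"}
v2 = {"OBJECTID": "esriFieldTypeOID", "status": "esriFieldTypeInteger",
      "notes": "esriFieldTypeString"}
print(schema_diff(v1, v2))
# {'added': ['notes'], 'removed': [], 'retyped': ['status']}
```

A report like this per layer is a lot faster than round-tripping each release through an FGDB export and an SDE load just to eyeball the fields.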
08-16-2019 09:20 AM

IDEA
Please create a proper executable instead of the AppxBundle for Collector for ArcGIS on Windows. The entire program should be installable from the install file. The Windows Store is too flaky and involved, between firewalls, security group permissions, and Azure Active Directory connections. We actually had to temporarily unblock Xbox Live on our firewall to get the Windows Store to finally install it. Sideloading is not an option for many organizations, and it only worked on a few machines. Please just release it as a proper program and not an app for Windows 10.
05-15-2019 08:24 AM

IDEA
Please create a proper executable instead of the AppxBundle. The entire program should be installable from the install files. The Windows Store is too flaky and involved, between firewalls, security group permissions, and Azure Active Directory connections, and sideloading is not an option for many organizations. Please just release it as a proper program and not an app for Windows. Please upvote this idea to make a proper install for Collector on Windows 10: https://community.esri.com/ideas/12845-make-collector-for-arcgis-on-windows-10-available-for-download-directly-from-esri
05-15-2019 08:19 AM

DOC
The option seems to have been removed; is there a new way of doing this?
05-15-2019 06:46 AM

POST
Here is an article sent to me by Esri (but not Esri-supported) about installing these bundles on Windows 10 that may help: https://www.howtogeek.com/285410/how-to-install-.appx-or-.appxbundle-software-on-windows-10/ Also, the newest version is currently 18.0.2, and it can be found on Esri Downloads.
02-21-2019 06:42 AM

IDEA
We ended up adding a loop to look for an exact match. I still think you should be able to just have the search function return an exact match directly.

```python
# Find the SD, update it, publish with overwrite, and set sharing and metadata
print("Search for original SD on portal…")

# Check that it found the correct layer
match = 0
layerno = 0
while match == 0:
    sdItem = gis.content.search(query="title:" + sd_fs_name + " AND owner: " + user,
                                item_type="Service Definition")[layerno]
    foundlayer = sdItem.title
    if foundlayer == sd_fs_name:
        match = 1
        print("Layer Returned: " + foundlayer + " vs " + sd_fs_name)
    else:
        match = 0
        layerno = layerno + 1
        print("Layer Returned: " + foundlayer + " vs " + sd_fs_name +
              " Index No: " + str(layerno))

# sdItem = gis.content.search("title:{} AND owner:{}".format(sd_fs_name, user),
#                             item_type="Service Definition")[0]
sdItem = gis.content.search(query="title:" + sd_fs_name + " AND owner: " + user,
                            item_type="Service Definition")[layerno]
print("Search title:{} AND owner:{}".format(sd_fs_name, user))
print("Found SD: {}, ID: {} \n Uploading and overwriting…".format(sdItem.title, sdItem.id))
sdItem.update(item_properties={'tags': itemtags, 'snippet': itemsummary,
                               'accessInformation': itemcredit},
              data=sd, thumbnail=itemThumbnail)
print("Overwriting existing feature service...")
fs = sdItem.publish(overwrite=True)
if shrOrg or shrEveryone or shrGroups:
    print("Setting sharing options…")
    fs.share(org=shrOrg, everyone=shrEveryone, groups=shrGroups)
rundate = datetime.datetime.now()
print("Updated: {} – ID: {}".format(fs.title, fs.id), " at " + rundate.isoformat())
```
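If it helps anyone else, the whole while-loop can be collapsed into a single pass over the search results. This is just a sketch of the same exact-title filter (the helper name is my own, not part of the API), and unlike an index-walking loop it returns None instead of raising IndexError when nothing matches:

```python
def find_exact(results, title):
    """Return the first search result whose title matches exactly, or None.
    `results` is the list returned by gis.content.search(...)."""
    return next((item for item in results if item.title == title), None)

# Usage (replacing the while loop):
# sdItem = find_exact(gis.content.search(query="title:" + sd_fs_name + " AND owner: " + user,
#                                        item_type="Service Definition"), sd_fs_name)
```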
02-13-2019 09:52 AM
| Title | Kudos | Posted |
|---|---|---|
| | 3 | 02-10-2025 09:11 AM |
| | 7 | 10-24-2024 08:36 AM |
| | 9 | 05-14-2024 06:34 AM |
| | 9 | 04-02-2024 06:36 AM |
| | 6 | 01-09-2024 06:02 AM |