POST
numpy or pandas is likely going to be faster. I always forget about Pandas being available now. I wouldn't hold the import times against them with a Python toolbox. You pay those costs upfront at toolbox initialization unless you're importing inside of your tool classes.
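For reference, the uniqueness check this thread is after is a one-liner in pandas - a minimal sketch, with a plain Python list standing in for the values pulled from a cursor (in a toolbox you might load them via `arcpy.da.TableToNumPyArray` instead):

```python
import pandas as pd

# Stand-in for the values read from the field being validated.
values = ["A1", "B2", "C3", "B2"]

s = pd.Series(values)
print(s.is_unique)           # False - "B2" appears twice
print(s.nunique(), len(s))   # 3 unique values out of 4 records
```

`Series.is_unique` saves you building the `set` yourself; whether it beats the pure-Python version depends mostly on how the values get into memory in the first place.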
Posted 10-16-2017 02:09 PM

POST
I am doing that to determine the unique values. The catch is determining if that number of unique values matches the length of the recordset - which tells me every value in the given field is unique
Posted 10-16-2017 02:05 PM

POST
Look at you rockstars! Lightning quick to respond. I actually came up with this a few minutes later and benchmarked it against 465,051 records. I'd like to get it down to 3 seconds or less.

    import time

    import arcpy

    def calculateRunTime(function, *args):
        startTime = time.time()
        result = function(*args)
        return time.time() - startTime, result

    def isUniqueValueField(dataset, field):
        idList = []
        with arcpy.da.SearchCursor(dataset, field) as cursor:
            for row in cursor:
                idList.append(row[0])
        return len(idList) == len(set(idList))

Benchmark result:

    >>> calculateRunTime(isUniqueValueField, parameters[2].valueAsText, parameters[3].valueAsText)
    (8.847000122070312, True)
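If most of the 8.8 seconds is cursor iteration there's little to gain, but one variant worth benchmarking is bailing out on the first duplicate instead of materializing the whole list first - a sketch, with a plain list of tuples standing in for the `SearchCursor` rows:

```python
def is_unique_value_field(rows):
    """Return False as soon as a duplicate value is seen."""
    seen = set()
    for row in rows:
        if row[0] in seen:
            return False
        seen.add(row[0])
    return True

# Stand-in rows; in the toolbox this would be
# arcpy.da.SearchCursor(dataset, field) instead.
print(is_unique_value_field([("A",), ("B",), ("C",)]))  # True
print(is_unique_value_field([("A",), ("B",), ("A",)]))  # False
```

The worst case (all values actually unique) does the same work as the original, but a field that fails validation fails fast.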
Posted 10-16-2017 01:59 PM

POST
I have a parameter in a Python toolbox which allows a user to select a field from a dataset given in a previous parameter.

    # Feature Class to absorb geometry from
    param2 = arcpy.Parameter(
        displayName="Geometry Feature Class",
        name="in_geoFC",
        datatype="GPFeatureLayer",
        parameterType="Required",
        direction="Input")

    # Table ID field
    param3 = arcpy.Parameter(
        displayName="Table Geometry ID Field",
        name="table_geoIDField",
        datatype="GPString",
        parameterType="Required",
        direction="Input",
        enabled=False)
    param3.filter.type = "ValueList"
    param3.filter.list = []

The updateParameters section contains logic to update parameter 3 with the names of all of the fields in the dataset provided in parameter 2 which have a datatype of String:

    def updateParameters(self, parameters):
        """Modify the values and properties of parameters before internal
        validation is performed. This method is called whenever a parameter
        has been changed."""
        # If 'in_geoFC' is populated with a value
        if parameters[2].value:
            # and 'in_geoFC' does not have an error set
            if not parameters[2].hasError():
                # Create a list of all of the fields in 'in_geoFC'
                # which have a datatype of 'String'
                fc_geoIDFields = [field.name for field in arcpy.Describe(
                                      parameters[2].valueAsText).fields
                                  if field.type == 'String']
                # Enable the parameter
                parameters[3].enabled = True
                # Populate the parameter with the list of text fields
                parameters[3].filter.list = fc_geoIDFields

This all works just fine... There's one more thing I need to do though. Whichever field the user selects for parameter 3, I need to ensure that the values contained in that field are unique - every record must have a unique value. I know of a pretty easy and elegant way to get the number of unique values in that field:

    len(set(r[0] for r in arcpy.da.SearchCursor(parameters[2].valueAsText,
                                                parameters[3].valueAsText)))

What I don't know is the quickest and most elegant way to get the total number of features in that dataset so that I can compare it to the number of unique values in that field and thus determine if all of the values in that field are unique. Keep in mind that this would be occurring in the updateMessages function, so it needs to be a fairly quick process. Any thoughts on how to get a record count super fast?
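For the raw count, `arcpy.GetCount_management(dataset)` (cast its first output to `int`) is the usual quick answer. Another option is to make a separate count unnecessary by tallying rows in the same cursor pass that builds the set - a sketch, with a plain list standing in for the cursor:

```python
def all_values_unique(rows):
    """Count rows and collect distinct values in a single pass."""
    total = 0
    distinct = set()
    for row in rows:
        total += 1
        distinct.add(row[0])
    return total == len(distinct)

# Stand-in rows; in updateMessages this would iterate a SearchCursor
# over the field chosen in parameter 3.
print(all_values_unique([("A1",), ("B2",), ("C3",)]))  # True
print(all_values_unique([("A1",), ("B2",), ("B2",)]))  # False
```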
Posted 10-16-2017 01:27 PM

POST
Thanks guys. I figured this was probably something that was implemented but wanted to check with the SMEs to make sure I wasn't missing something. Boo, Esri. Fix it. https://community.esri.com/ideas/14090
Posted 10-13-2017 02:05 PM

IDEA
This doesn't work:

    param0 = arcpy.Parameter(
        displayName="Table",
        name="in_Table",
        datatype="GPTableView",
        parameterType="Required",
        direction="Input")
    param0.filter.list = ["Table"]

It should. Please fix it.
Posted 10-13-2017 02:05 PM

POST
Trying to set a filter on a Python toolbox parameter to only allow tables to be input. Currently doing this:

    param0 = arcpy.Parameter(
        displayName="Table",
        name="in_Table",
        datatype="GPTableView",
        parameterType="Required",
        direction="Input")
    param0.filter.list = ["Table"]

However, the filter doesn't seem to work.
Posted 10-13-2017 11:04 AM

IDEA
Currently, when you drag and drop a CSV or text file into a web map in Portal or AGOL, the Portal attempts to geocode that file and display the records on the map - which is a fantastic feature. However, what happens if the data in your file represents a standard geography like Blockgroups, Tracts, Counties, ZIPs, CBSAs, Congressional Districts, DMAs or US States? I'll tell you what happens - the geocoder returns the result as the centroid of each of those geographies. While that makes sense if you technically understand what's going on behind the scenes, in practice it delivers a really poor user experience.

Let's use the example of U.S. States. I have a CSV or text file that contains 50 records. Each record has, let's say, 10 attributes. Of those 10 attributes, one is a field called 'NAME' or 'STATE' or 'ST' or 'STATE_ID'. When I drag this CSV or text file onto a map in AGOL or Portal, Portal should analyze my file before sending it to the geocoder. Specifically, it should:

- Assess whether or not I have the exact same number of records as a known standard geography, such as States - which would be 50 records. (Blockgroups: 217,740 records; Tracts: 73,057 records; ZIPs: 43,000 records; Counties: 3,142 records... so on and so forth)
- Determine if each record is unique
- Determine if each record can be mapped to a single geography within that standard geography

If these conditions are met, ArcGIS Online should not send the dataset to the Geocoding Service, as the user probably doesn't want centroids - the user wants to create a choropleth (aka color-coded map). So what it should do instead is automagically join the data to the matching standard geography, then pass it off to Smart Mapping to help the user classify and symbolize their data.
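The record-count-and-uniqueness screening described above could be sketched roughly as follows - the geography names and counts are the illustrative figures from this post, not an actual Esri lookup table:

```python
# Illustrative record counts per standard geography (figures from the text).
STANDARD_GEOGRAPHY_COUNTS = {
    "US States": 50,
    "Counties": 3142,
    "ZIPs": 43000,
    "Tracts": 73057,
    "Blockgroups": 217740,
}

def candidate_geographies(id_values):
    """Return the standard geographies whose known record count matches
    the file, provided every ID in the file is unique."""
    ids = list(id_values)
    if len(ids) != len(set(ids)):  # every record must be unique
        return []
    return [name for name, count in STANDARD_GEOGRAPHY_COUNTS.items()
            if count == len(ids)]

# A file of 50 unique state identifiers would match "US States".
states = ["state_%02d" % i for i in range(50)]
print(candidate_geographies(states))  # ['US States']
```

The third condition - that each record maps to exactly one feature in the candidate geography - would still require a join against the actual boundary layer.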
Posted 10-13-2017 09:39 AM

POST
You might try the workaround I used:

1. Create an empty web map in ArcGIS Online which uses the Vector Basemap.
2. Save that empty web map as something like "Navigation - Vector" in ArcGIS Online.
3. Sign into ArcGIS Online in Pro.
4. Open your "Navigation - Vector" web map.

Another approach - though I haven't tried it - could be to save the Vector Basemap layer in your web map as a layer file in ArcGIS Pro once you've done the above. Then just add that layer file to your web map anytime you need it. That would theoretically allow you to delete the "Navigation - Vector" web map you created. Again, I've not tested this on my end, but in theory it should work.
Posted 10-13-2017 09:22 AM

IDEA
Assignment dependencies - that is, the ability to make the starting of one assignment dependent on another assignment having a specific status - would be a very useful feature. Consider the following scenario: Assignments A, B and C are assigned to workers 1, 2 and 3 respectively. Each assignment is given a due date. Assignment B is dependent on Assignment A having a status of Complete. Assignment C is dependent on Assignment B having a status of Complete or Declined.

Once a project is established with these interdependent assignments, the end user workflow would be something like the following:

1. Worker 1 starts Assignment A, changing its status to In-Progress. Workers 2 and 3, who are assigned Assignments B and C respectively (which are in turn dependent on Completion/Decline of Assignments A and B respectively), would see their assignments in the Workforce app, but the ability for them to start the assignments would be disabled and indicate that their dependencies have not yet been met.
2. Worker 1 Completes Assignment A. Worker 2, who is assigned Assignment B (dependent on Completion of Assignment A), would receive a notification that all dependencies for Assignment B have been met. Worker 3, who is assigned Assignment C (dependent on Completion or Decline of Assignment B), sees no change. Assignment C still cannot be started and indicates that not all assignment dependencies have been met.
3. Worker 2 Declines Assignment B. Perhaps this is because they review the notes for Assignment A (which maybe they could do, since it is a dependency of Assignment B) and see something in Assignment A's notes that indicates Assignment B should not be performed. In any case, Worker 2 declines Assignment B. Worker 3 receives a notification that all dependencies for Assignment C have been met.
4. Worker 3 Completes Assignment C. Worker 3 reviews the notes of Assignment B and determines that Assignment C should be completed. Worker 3 completes Assignment C and marks it as such within the Workforce app.

This kind of interdependent assignment would allow organizations to use Workforce much more dynamically for longer-running assignments such as construction projects, store openings, relocations or any other project that has lots of moving parts. It doesn't need to be overly complicated either. I don't think there would be a need to allow assignments to be dependent on multiple assignments, because with the workflow above that could be achieved by creating "dependency chains" (i.e. Assignment D is dependent on Assignment C, which is dependent on Assignment B, which is dependent on Assignment A). I realize this is actually a really, really big scope, but I think the use case would add a ton of value to Workforce.
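The gating rule behind the scenario above is small enough to sketch - this is a purely hypothetical data model for illustration, not anything from the Workforce schema:

```python
# Each assignment may depend on one other assignment reaching
# one of a set of statuses (the "dependency chain" from the scenario).
assignments = {
    "A": {"status": "Assigned", "depends_on": None, "unblock_when": set()},
    "B": {"status": "Assigned", "depends_on": "A",
          "unblock_when": {"Complete"}},
    "C": {"status": "Assigned", "depends_on": "B",
          "unblock_when": {"Complete", "Declined"}},
}

def can_start(name):
    """An assignment can start once its dependency (if any) has
    reached one of the statuses that unblocks it."""
    dep = assignments[name]["depends_on"]
    if dep is None:
        return True
    return assignments[dep]["status"] in assignments[name]["unblock_when"]

print(can_start("B"))                      # False - A is not Complete yet
assignments["A"]["status"] = "Complete"
print(can_start("B"))                      # True
assignments["B"]["status"] = "Declined"
print(can_start("C"))                      # True - Complete OR Declined unblocks C
```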
Posted 10-11-2017 06:34 AM

IDEA
I'm not sure I would advocate allowing access to maps without any form of authentication. I think a better solution might be to support some form of local authentication. This could be done by generating hash values for a user's credentials when they successfully authenticate while their device has connectivity. Those hash values could then be stored locally by Collector, and when a user was required to authenticate while offline, Collector could simply hash the credentials the user provides and determine whether that hash value matches the hash value of the credentials Collector stored previously. This would allow local authentication without having to be online. In any case, I support resolving the core issue this idea presents - authenticating while disconnected.
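The store-a-hash idea can be sketched with Python's standard library - purely illustrative of the concept, not Collector's actual implementation:

```python
import hashlib
import hmac
import os

def make_offline_record(username, password):
    """Created while online: store a salted, slow hash - never the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return {"user": username, "salt": salt, "digest": digest}

def verify_offline(record, password):
    """Later, while offline: hash the supplied password the same way
    and compare in constant time."""
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(),
                                  record["salt"], 100_000)
    return hmac.compare_digest(attempt, record["digest"])

record = make_offline_record("jdoe", "s3cret!")
print(verify_offline(record, "s3cret!"))  # True
print(verify_offline(record, "wrong"))    # False
```

A salted, slow KDF like PBKDF2 matters here because anything stored on the device has to be assumed recoverable by an attacker.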
Posted 10-09-2017 05:48 AM

IDEA
FYI: At UC17, the EEAP (Esri Enterprise Advantage Program) Special Interest Group meeting announced that System Monitor would be an entitlement for customers that subscribe to the EEAP. So if your organization subscribes to the EEAP, you'll get System Monitor at no cost. If you don't currently subscribe to EEAP, depending on the cost of SM on its own, an EEAP subscription might make sense given the other benefits that come with that program.
Posted 10-06-2017 07:24 AM