POST
Has anyone had any luck using the prefix in an SQL_CLAUSE? I am using ArcGIS 10.1 SP1 with a 10.0 file geodatabase. I would like to read a single text column from a large polygon attribute table using a data access cursor. Originally I used other Python methods for creating a unique list of sorted values, but when the featureclass got to be very large, performance began to suffer: the larger the list gets, the longer it takes to determine whether a value already exists in it. If I correctly understand the SQL_CLAUSE parameter for da cursors, I should be able to use the "DISTINCT" prefix clause to limit the number of records I have to read. The unique MUKEY values will be used to populate a list.

    # Create list of MUKEY values from input mapunit polygon featureclass
    mukeyList = list()
    muCnt = 0
    arcpy.SetProgressor("step", "Reading MUKEY values...", 0, muTotal)

    with arcpy.da.SearchCursor(inputFC, ["MUKEY"], "", "", False, ("DISTINCT", "ORDER BY MUKEY")) as srcCursor:
        # Create a unique, sorted list of MUKEY values in the MUPOLYGON featureclass
        for rec in srcCursor:
            mukeyList.append(rec[0])
            muCnt += 1
            arcpy.SetProgressorPosition()
            arcpy.SetProgressorLabel("Reading MUKEY values (" + Number_Format(muCnt, 0, True) + ")")
What's funny is that the ORDER BY postfix is working, but the list still ends up containing an MUKEY for every polygon, including all the thousands of duplicate values. If anyone has gotten the SQL_CLAUSE prefix to work properly, I would greatly appreciate a point in the right direction. -Steve
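In the meantime, a plain-Python workaround for the list-membership slowdown described above is to collect values in a set (average O(1) membership tests) and sort once at the end. A minimal sketch, independent of arcpy; the `rows` list below is a stand-in for the tuples a SearchCursor yields:

```python
# Minimal sketch: deduplicate with a set instead of testing list membership.
# 'rows' stands in for tuples returned by a da.SearchCursor over ["MUKEY"];
# with arcpy you would iterate the cursor itself.
rows = [("463163",), ("463164",), ("463163",), ("463165",), ("463164",)]

mukeys = set()
for rec in rows:
    mukeys.add(rec[0])          # set insertion/membership is O(1) on average

mukeyList = sorted(mukeys)      # one sort at the end replaces ORDER BY
print(mukeyList)
```

This sidesteps both the DISTINCT prefix and the per-append membership test, at the cost of holding every unique value in memory, which is usually fine for a key column.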
Posted 02-03-2014 07:30 AM

POST
Thanks Jill! Just have to be patient I guess. -Steve
Posted 11-26-2013 03:01 AM

POST
I am trying to create an ArcTool using Python in ArcGIS 10.1 SP1. I want to display a two-column value table containing string values. Edit: After rereading the documentation, I discovered that I've been mixing code from standard script tools with that of the new-at-10.1 Python toolboxes. Value tables aren't a supported parameter dataType for the original script tools, which is rather frustrating.

There will be 3 parameters. The first is a text file, the second is a temporary string parameter for displaying error messages in my Tool Validation class, and the third is my value table. I found a few bits and pieces of value table code for the tool validation class, but I'm doing something wrong right out of the gate. I keep getting an error message from the getParameterInfo function: AttributeError: 'NoneType' object has no attribute 'addRow'. Evidently I'm not defining the table properly and it fails when trying to create the object. I don't actually try to add a row to the table; the error gets thrown from the first 'param2' line. I've tried several variations and set the multiValue property to both True and False with no difference. It isn't clear to me whether I need to set the dataType property as composite, or whether I need to create this parameter in the Tool Properties interface beforehand. Anybody have a complete working example I could look at? I would greatly appreciate it! -Steve
    def getParameterInfo(self):
        # Define parameter definitions
        try:
            # Value Table parameter
            param2 = arcpy.Parameter(displayName="Value Table", name="in_featdesc",
                                     datatype="GPValueTable", parameterType="Required",
                                     direction="Input", multiValue=True)
            param2.columns = [["FEATSYM", "String"], ["FEATDESC", "String"]]
            #param0.defaultEnvironmentName = "workspace"
            return [param2]  # getParameterInfo must return the list of parameters

        except:
            tb = sys.exc_info()[2]
            tbinfo = traceback.format_tb(tb)[0]
            theMsg = tbinfo + "\n" + str(sys.exc_type) + ": " + str(sys.exc_value)
            self.params[1].value = theMsg
Posted 06-29-2013 04:51 AM

POST
Hi, I just started seeing similar issues when running an ArcSDE-to-FGDB export script. I hadn't run it for a couple of months, but yesterday I was running a small export of about 60 tables and 5 featureclasses from our ArcSDE 10.1 database (SQL Server 2008 R2). Usually the failure was during the Copy_management command.

There was a strange pattern. I first experienced the problem when running it on my laptop. When exporting to a FGDB under a folder on my C: drive, it would fail at different times, copying different tables. When I changed my output location to another (slower) internal D: drive, it would work every time. The failure was during Copy_management and the error message was not helpful. After each failure I would check the input table and output geodatabase and could find no problems; everything seemed to be fine and I could manually run Copy_management without a problem. My laptop drives are C: Micron SSD and D: Hitachi TravelStar ATA 500GB.

Next I moved operations to another, higher-performance desktop computer. I saw very similar problems, except that the first failure to export to a FGDB under C:\Geodata occurred near the end of the process. All of the tables and featureclasses were successfully copied and the script was in the process of recreating the relationshipclasses. The error message indicated that the output FGDB was read-only. Subsequent attempts to export to C:\Geodata failed after a couple of minutes while copying tables. It did not fail on the same table each time. As on my laptop, I tried changing my output location to another internal E: drive and it worked perfectly every time. Both my C: and E: drives are Intel 300GB SSDs.

About 5 months ago I ran all of these exports hundreds of times for terabytes of data without seeing any issues like this. My current theory is MS Endpoint Protection or a Windows 7 update; we didn't have Endpoint Protection 5 months ago. The C:\Geodata folder is a location that is supposed to be excluded from scanning, so my results are the opposite of what I would have expected, but I'm not sure whether the folder exclusions apply to the 'Real-time protection' process. We don't have any FGDB file extensions or SQL Server database file extensions in our exclusion list; I've been trying to get IT to add those, but no luck so far. Next step is ESRI support. It would be helpful if ESRI would put out a white paper or technical bulletin on the subject of virus scan exclusions. An official document would smooth the way tremendously. -Steve
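For intermittent copy failures like those described above, one stopgap (not a fix for the underlying scanner interference) is to wrap each copy in a retry loop with a short delay. A minimal sketch in plain Python; `shutil.copyfile` is a stand-in here for arcpy's Copy_management, and the attempt/delay values are arbitrary:

```python
import shutil
import time

def copy_with_retries(src, dst, attempts=3, delay=1.0):
    """Try to copy src to dst, retrying on failure.

    Returns the number of attempts used; re-raises the last error
    if every attempt fails.
    """
    last_err = None
    for n in range(1, attempts + 1):
        try:
            shutil.copyfile(src, dst)   # stand-in for arcpy.Copy_management(src, dst)
            return n
        except OSError as err:
            last_err = err
            time.sleep(delay)           # give the scanner time to release the file
    raise last_err
```

Logging which attempt succeeded for each table would also help confirm (or rule out) the real-time-protection theory.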
Posted 05-14-2013 05:22 AM

POST
I'm not sure that I can call this truly solved, but as a workaround, I found that if I converted "E:\Temp" to an ArcInfo workspace with INFO directory, the polygon to raster conversion seems to work fine. -Steve
Posted 05-10-2013 05:04 AM

POST
I just installed 64-bit background geoprocessing and wanted to test one of my custom Python-based ArcTools for converting polygons to a geodatabase raster. The ArcTool works fine in the foreground, but fails when the background processing option is selected. The failure occurs on the line where I use a Con statement with FocalStatistics to create a temporary raster. The GDAL driver tries but fails to create a temporary grid (g_g02) under "C:\Program Files (x86)\ArcGIS\Desktop10.1\bin64". I don't have permission to write in that directory and would prefer it be put somewhere else even if I did. I made sure that my system environment variables TEMP, TMP and ARCTMPDIR are all set to "E:\Temp" and my scratch workspace setting is "E:\Temp\scratch.gdb". I also checked that the input raster for FocalStatistics exists and doesn't have any problems. All my input and output data are stored on local drives. Any suggestions would be welcome. Thanks, Steve
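As a quick sanity check before launching a job like this, the temp-related environment variables can be verified (and set for the current process) from Python. A minimal sketch; `E:\Temp` here is just the folder from the post above, and note that os.environ changes only affect the current process and its children, which matters because background geoprocessing runs in a separate process:

```python
import os

TEMP_DIR = r"E:\Temp"   # stand-in path; use whatever writable temp folder you prefer

# Point the common temp variables at the same writable location for this process
for var in ("TEMP", "TMP", "ARCTMPDIR"):
    os.environ[var] = TEMP_DIR

# Verify they all agree before kicking off the job
assert all(os.environ[v] == TEMP_DIR for v in ("TEMP", "TMP", "ARCTMPDIR"))
```

If the background process still writes under bin64, the variables were likely inherited from the system settings at the time that process started, not from the script.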
Posted 05-07-2013 04:27 AM

POST
Jocelyn, There can be legacy restrictions involved when the output format is GRID. If you are having problems that only occur when the output location is a specific drive or folder, then my first thought would be to check for a path that cannot be a valid ArcInfo workspace. The complete list is documented somewhere (http://webhelp.esri.com/arcgisdesktop/9.2/index.cfm?TopicName=About_the_ESRI_grid_format lists some), but the first things I look for are spaces in the path or strange characters. I also try to keep paths short. Another problem I have run into is when a conversion tool automatically names the output GRID using the same filename as the input; if you have a long shapefile name, that can cause the tool to fail with a useless error message. This may only apply to older versions of ArcGIS, as some of these problems have been getting better with ArcGIS 10.1. -Steve
Posted 04-06-2013 04:30 AM

POST
dmhoneycutt, Thanks. I tried the FIDSet method and got some interesting results using ArcGIS 10.0. When FIDSet was used to get a count for the selected set, it was really fast compared to GetCount. For some reason I was unable to get FIDSet to work against a featurelayer containing the entire featureclass (3.3 million records); I was getting back an empty string. Not at all what I was expecting. There must be something wrong with GetCount in ArcGIS 10.0. I decided to try GetCount one more time, just to get the total number in the featureclass, and surprisingly it worked fine. So for now I am using a combination of FIDSet to get the number selected and GetCount for the total number of features in the featureclass. It turns out that my original slow performance was entirely due to using GetCount on the featurelayer. I wasn't able to try the da cursor since my ArcTool has to be compatible with both 10.0 and 10.1. Thanks anyway for the suggestion.
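For what it's worth, the empty string above is how Describe reports "no selection": FIDSet is a semicolon-delimited string of selected OBJECTIDs, and an empty string means the layer has no selection at all (not an error). A minimal pure-Python sketch of turning that string into a has-selection flag and a count; the sample strings stand in for `arcpy.Describe(layer).FIDSet`:

```python
def selection_info(fid_set):
    """Interpret a Describe FIDSet string.

    An empty string means no selection; otherwise the string is a
    semicolon-delimited list of selected OBJECTIDs.
    Returns (has_selection, selected_count).
    """
    fid_set = fid_set.strip()
    if not fid_set:
        return (False, 0)
    oids = [int(oid) for oid in fid_set.split(";")]
    return (True, len(oids))

print(selection_info(""))           # layer with no selection
print(selection_info("3; 17; 42"))  # layer with three selected features
```

This avoids GetCount entirely when all you need is the True/False selection state.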
Posted 04-03-2013 06:05 PM

POST
I have an ArcTool that runs a Python script. One thing I've always liked about out-of-the-box tools in ArcMap is that they often have the 'Use Selected Features' checkbox. If nothing else, it warns the user that the process may be applied to only part of the layer instead of the entire thing. This checkbox needs to be enabled when the input is a featurelayer, and the box is automatically checked when there is already a selection on that layer. If the input is a featureclass, then the checkbox is disabled. I've tried programming that into the Tool Validation class and it sort of works, but using GetCount to compare the featurelayer record count to the featureclass record count is way too slow for large datasets, causing the tool to 'hang' for quite a while. I really don't care how many records are selected, just whether there is a selection: True or False. Does anybody have a suggestion for quickly finding out if a selection has been applied? I'm hoping there's a layer property in arcpy.mapping that might work, but I don't see anything obvious. -Steve
Posted 04-03-2013 10:38 AM

POST
I need to get all the properties for a geodatabase relationshipclass. I can use a Python script to get most of the properties via the geoprocessing "Describe" but unless I'm missing something, the names of the primary and foreign key fields aren't exposed to this method. Can someone please point me in the right direction? There has to be a way. -Steve
Posted 01-09-2013 04:09 AM

POST
Hi Matt, I have used JoinField with Python script tools for 9.3.1, 10.0 and 10.1 SP1. In most cases I used it for featureclass-to-standalone-table 'joins' or raster-to-standalone-table 'joins' within a file geodatabase. I usually don't use JoinField in a table-to-table join and never across two different workspaces. The only issue I've ever had was at 9.3.1, where I had to select all records in the target layer or table to get JoinField to work properly; otherwise I ended up with the new field(s) but NO data, and no error messages were ever generated when it failed. I haven't had any problems with 10.1, so I dropped the 'select all' step from my scripts. I know there was at least one thread in the forums with folks getting partial data population, and I believe there was a bug report for that issue, but I never saw that kind of behavior; it was all or nothing for me. Another tidbit: I experienced the same behavior whether I ran JoinField from a script or from ArcToolbox. It is easy enough to test using 'SelectLayerByAttribute'. -Steve
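When JoinField itself is suspect, the same carry-a-field-over step can be sketched in plain Python with a dictionary lookup, which also makes silent partial population easy to detect. A minimal sketch; the field names and sample values below are made up for illustration:

```python
def join_field(target_rows, join_rows, key, field):
    """Carry join_rows[field] into each target row, matching on key.

    target_rows and join_rows are lists of dicts. Rows with no match
    get None, so 'all-or-nothing' population failures are easy to spot.
    """
    lookup = {row[key]: row[field] for row in join_rows}
    for row in target_rows:
        row[field] = lookup.get(row[key])   # None when the key has no match
    return target_rows

# Hypothetical sample data standing in for a target table and a join table
target = [{"MUKEY": "463163"}, {"MUKEY": "999999"}]
joined = join_field(target,
                    [{"MUKEY": "463163", "MUNAME": "Houghton muck"}],
                    "MUKEY", "MUNAME")
print(joined)
```

Counting the None values afterwards gives a quick "how many rows failed to join" report that JoinField never produces.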
Posted 12-10-2012 04:17 AM

POST
Hi, In ArcGIS 10.1, I've run into the same problem as kpeter (also SQL Server, Identity column as primary key). Does anyone have a workaround that will allow the table to be exported with a new OBJECTID column instead of changing the type of an existing column from integer to OID? Depending on which geoprocessing method you use to export the table, I've even seen the original column get renamed to OBJECTID. That's not cool. -Steve
Posted 11-16-2012 09:26 AM

POST
Matshopp, I do not believe that any geoprocessing methods are exposed that will allow a user to permanently change a geodatabase field alias. I'm not sure why, it seems like it would be a very useful geoprocessing tool. Hopefully someone else can tell me that I am incorrect. For table aliases, you can use the arcpy.AlterAliasName(table, alias) method. I think this is new at 10.0. In 9.x geodatabases, it was possible (but definitely NOT recommended) to alter aliases by editing the GDB_FieldInfo table. If done incorrectly you could trash the geodatabase so I won't go into further detail. If you have experience with ArcObjects, you can use the esriGeodatabase.IClassSchemaEdit and AlterFieldAliasName methods. I have used this with ArcGIS 9.3 and MS SQL Server but haven't checked to see if these still apply at 10.x. -Steve
Posted 11-16-2012 05:35 AM

POST
Thanks Jake. My bad for not looking a little harder. The AddIndex_management tool documentation doesn't mention any of those restrictions and in fact uses a file geodatabase in its example code. -Steve
Posted 09-28-2012 04:36 AM

POST
I am having problems creating an attribute index on file geodatabase tables that is UNIQUE and ASCENDING. The FGDB tables contain data that originated from an ArcSDE-SQL Server database, which had indexes on the primary key fields (all character) that, according to the table properties dialog in ArcCatalog, are all UNIQUE and sorted in ASCENDING order. After copying some of the data to new tables in a file geodatabase, I have been unable to duplicate the indexes in the file geodatabase. I have tried this under 9.3.1 and 10.0 with the same results, through ArcToolbox, through the table properties/indexes dialog and through scripting, and with several different tables and file geodatabases. There are no error messages, but the resulting indexes always end up NON_UNIQUE. I thought perhaps there was a problem with creating unique indexes on a character or text field, so I tried creating a new LONG field and calculating it equal to the OBJECTID; I wasn't able to create a UNIQUE index on that field either. The last thing I tested was a personal geodatabase and, guess what, it works fine. The only problem is that I can't use a personal geodatabase because of size limitations. Any suggestions? Am I worrying about nothing? It seems that with a large database it would be important to have the correct type of index. -Steve
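Since no error is raised when the UNIQUE request is silently downgraded, it is at least worth verifying up front that the data itself is unique. A minimal pure-Python sketch using collections.Counter; the sample values stand in for a field read via a cursor:

```python
from collections import Counter

# Stand-in for a field's values read via a SearchCursor
values = ["463163", "463164", "463163"]

# Any value occurring more than once rules out a UNIQUE index on this field
dups = [v for v, n in Counter(values).items() if n > 1]
print(dups)
```

An empty result means the field's data would support a unique index, so any NON_UNIQUE outcome is the workspace's doing, not the data's.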
Posted 09-27-2012 08:39 AM
Title | Kudos | Posted
---|---|---
 | 1 | 09-17-2010 05:10 AM

Online Status: Offline
Date Last Visited: 11-11-2020 02:23 AM