TableToTable_conversion: Problem with SQL Query

10-10-2011 11:03 AM
by Anonymous User
I've just recently started using Python to try to automate tasks and be more efficient.

I have a table that I would like to split into multiple tables based on the value of a field.  The problem is that when I set my SQL query in the arcpy.TableToTable_conversion function, I'm pretty sure the query doesn't recognize the variable I'm trying to reference, but I'm struggling to find a different way to accomplish my goal.  Is there a way to use variables in an SQL query with this tool?

My code:

import arcpy

#set variables
in_table = r"\\Server\Folder1\Folder2\MeterRouteFix\CogsdaleTest.dbf"
out_path = r"\\Server\Folder1\Folder2\MeterRouteFix"
ThisRoute = 1

MeterTable = "Meters.dbf"
MTableFull = out_path + "\\" + MeterTable
RField = "Route"
RtField = "!RTEID!"

arcpy.Delete_management(MTableFull)

arcpy.TableToTable_conversion (in_table, out_path, "Meters.dbf")

#Needed to make Route field a number so the While loop below could be used
##arcpy.AddField_management(MTableFull, RField, "TEXT","","",3)

arcpy.AddField_management(MTableFull, RField, "SHORT","","")

#This didn't work because RtField[-3:] slices the literal string "!RTEID!"
#(giving "ID!") instead of each row's value; the expression would need to be "!RTEID![-3:]"
##arcpy.CalculateField_management(MTableFull,RField,RtField[-3:], "PYTHON")

#UpdateCursor used instead of Calculate Field, probably will have to use this most often
#Help for arcpy.UpdateCursor: http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#//000v0000003m000000

rows = arcpy.UpdateCursor(MTableFull)

for row in rows:
    if row.RTEID[-3:-1]=="00":
        row.Route = row.RTEID[-1:]
        rows.updateRow(row)
    elif row.RTEID[-3:-2]=="0" and row.RTEID[-3:-1]!="00":
        row.Route = row.RTEID[-2:]
        rows.updateRow(row)
    else:
        row.Route = row.RTEID[-3:]
        rows.updateRow(row)

#Delete cursor and row objects in order to remove locks on the data
del row
del rows
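(The three-branch if/elif above just strips leading zeros from the last three characters of RTEID; a one-line equivalent, assuming those three characters are always digits, would be:)

```python
def route_from_rteid(rteid):
    """Strip leading zeros from the last three characters of RTEID.

    Does the same job as the three-branch if/elif above, assuming the
    last three characters of RTEID are always digits.
    """
    return int(rteid[-3:])

# "ABC001" -> 1, "ABC012" -> 12, "ABC123" -> 123
```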



#export to multiple tables
while ThisRoute < 130:
  arcpy.TableToTable_conversion (MTableFull, out_path, str(ThisRoute)+".dbf", """ "Route" = ThisRoute """)
  ThisRoute += 1

It creates the first table, but then fails on the SQL query:

ExecuteError: ERROR 999999: Error executing function.
A column was specified that does not exist.
A column was specified that does not exist.
Failed to execute (TableToTable).
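(For context, the where clause is passed to the tool verbatim; Python never expands a variable name inside a string literal, so the tool receives the text ThisRoute and looks for a column with that name, which explains the error above:)

```python
ThisRoute = 1
where = """ "Route" = ThisRoute """

# The variable's value never makes it into the string:
print(where)                  # outputs:  "Route" = ThisRoute
print("ThisRoute" in where)   # True -- the tool sees the name, not the value 1
```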
4 Replies
JoelCalhoun
Hopefully something like this will help:

while ThisRoute < 130:
     arcpy.TableToTable_conversion (MTableFull, out_path, str(ThisRoute)+".dbf", '"Route" = ' + "'" + ThisRoute + "'")
     ThisRoute = ThisRoute + 1



Joel
by Anonymous User
Hopefully something like this will help:

while ThisRoute < 130:
     arcpy.TableToTable_conversion (MTableFull, out_path, str(ThisRoute)+".dbf", '"Route" = ' + "'" + ThisRoute + "'")
     ThisRoute = ThisRoute + 1



Joel


I really appreciate the response, but the above gave me the error: TypeError: cannot concatenate 'str' and 'int' objects.

I have achieved the results I wanted by creating all of the tables I need (containing all of the data from the original table) and then deleting the unwanted records.  I added the following lines to the end of the code contained in my original post:

dbfTable = 1

while dbfTable < 130:
    NewRouteTable = out_path + "\\" + str(dbfTable) +".dbf"

    rowspurge = arcpy.UpdateCursor(NewRouteTable)

    for row1 in rowspurge:
        if row1.Route != dbfTable:
            rowspurge.deleteRow(row1)
    dbfTable+=1
    del row1
    del rowspurge


I know there has to be a better way!
JoelCalhoun
I'm glad you found a solution.

Though fixing the concatenation issue is simple:

while ThisRoute < 130:
     ThisRoute = str(ThisRoute)
     arcpy.TableToTable_conversion (MTableFull, out_path, ThisRoute + ".dbf", '"Route" = ' + "'" + ThisRoute + "'")
     ThisRoute = int(ThisRoute) + 1
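
(The same fix reads a little cleaner with a for loop and str.format, which sidesteps the int/str conversions entirely; a sketch, with the arcpy call itself unchanged from above:)

```python
# Sketch: build each output name and where clause with str.format
# instead of concatenation, so no explicit str()/int() juggling is needed.
for route in range(1, 130):
    out_name = "{0}.dbf".format(route)
    where = "\"Route\" = '{0}'".format(route)
    # arcpy.TableToTable_conversion(MTableFull, out_path, out_name, where)
```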
by Anonymous User
Perfect!...funny how a couple months of experience with Python makes this answer seem like it should have been so obvious.

Thank you for your help.