import arcpy
from arcpy import env
env.workspace = "C:/data"
arcpy.TableToTable_conversion("vegtable.dbf", "C:/output/output.gdb", "vegtable")
There is an easier way if you don't need to get fancy with the nifty functions of field mappings:
If you create a table for output and add just the fields you want and use Append_management with the NO_TEST option, only the fields with matching names get copied.
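A minimal sketch of that approach, assuming hypothetical output table and field names (VEG_CODE and ACRES are placeholders for whichever fields you actually want to keep):

import arcpy

out_gdb = "C:/output/output.gdb"
out_table = out_gdb + "/vegtable_subset"   # hypothetical output table

# Create an empty table containing only the fields you want to keep
arcpy.CreateTable_management(out_gdb, "vegtable_subset")
arcpy.AddField_management(out_table, "VEG_CODE", "TEXT", field_length=10)
arcpy.AddField_management(out_table, "ACRES", "DOUBLE")

# NO_TEST skips schema matching; only fields whose names match get copied
arcpy.Append_management("C:/data/vegtable.dbf", out_table, "NO_TEST")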
Using Table To Table, you would have to use field mappings to set up the conversion from source fields to destination fields. Alternatively, you could use an insert cursor on a template table with the field restrictions you want. You can find field mapping examples on these forums.
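For the field mapping route, here is a rough sketch (the field names VEG_CODE and VegCode are placeholders, not anything from your data):

import arcpy

in_table = "C:/data/vegtable.dbf"

# Build a FieldMappings object holding only the fields we want in the output
fms = arcpy.FieldMappings()

fm = arcpy.FieldMap()
fm.addInputField(in_table, "VEG_CODE")   # source field (placeholder name)
out_fld = fm.outputField
out_fld.name = "VegCode"                 # rename in the output if desired
fm.outputField = out_fld
fms.addFieldMap(fm)

# Only fields added to the FieldMappings object end up in the output table
arcpy.TableToTable_conversion(in_table, "C:/output/output.gdb", "vegtable", "", fms)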
Here's an example of me working through some field mapping issues I was having; it should give you all the pieces you need for that route.
http://forums.arcgis.com/threads/72269-field-map-problems
I am dealing with 7+ million records, so performance is a must. I made a script based on mzcoyle's insert cursor example and it works, but it will take too long. What kind of performance will I get with Append_management? The files I am starting with are .txt files split into two (roughly 3.5 million records each), so I am also trying to figure out the fastest way to get them both into a single table in a file geodatabase.
Is there a way to set field length as well? I didn't see it on the help page. Most of the fields I am exporting are text, and the default 255 length eats up way too much space and slows down other operations. Thanks!
field.length = 10
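To put that in context: within a field map you pull the output Field object from the FieldMap, change its length, and assign it back before running the conversion. A rough sketch, assuming you want every text field shortened (the 10-character length is just an example):

import arcpy

in_table = "C:/data/vegtable.dbf"

# Start from the default field mappings for the input, then shrink the text fields
fms = arcpy.FieldMappings()
fms.addTable(in_table)

for i in range(fms.fieldCount):
    fm = fms.getFieldMap(i)
    field = fm.outputField
    if field.type == "String":
        field.length = 10            # example length; set per field as needed
        fm.outputField = field
        fms.replaceFieldMap(i, fm)

arcpy.TableToTable_conversion(in_table, "C:/output/output.gdb", "vegtable", "", fms)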
An insert cursor would give you the best performance for this kind of operation. Can you post the code where you aren't getting the performance you need?
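In the meantime, here is a rough sketch of loading both .txt files into one geodatabase table with a da.InsertCursor (10.1+); the paths, field names, delimiter, and header handling are all assumptions about your data:

import arcpy
import csv

target = "C:/output/output.gdb/vegtable"                  # hypothetical target table
text_files = ["C:/data/part1.txt", "C:/data/part2.txt"]   # hypothetical inputs
fields = ["VEG_CODE", "ACRES"]                            # must match the target schema

# arcpy.da.InsertCursor is considerably faster than the classic InsertCursor
with arcpy.da.InsertCursor(target, fields) as cursor:
    for txt in text_files:
        with open(txt) as f:
            reader = csv.reader(f)        # assumes comma-delimited text
            next(reader)                  # skip a header row, if present
            for row in reader:
                cursor.insertRow((row[0], float(row[1])))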