
Conversion from CSV to DBF altering data...

02-04-2016 01:19 PM
DannyLackey
Deactivated User

So, I successfully converted my .csv to a .dbf with this script:

import arcpy
from arcpy import env

# set the geoprocessing workspace
env.workspace = r"c:\Output"

inTable = r"c:\Output\test.csv"
outLocation = r"c:\Output"
outTable = "test.dbf"

arcpy.TableToTable_conversion(inTable, outLocation, outTable)

The problem is, in the resulting .dbf file, it's adding a decimal point and trailing zeroes to the value:

750050

becomes

750050.00000000000

How can I avoid this?

Accepted Solution
XanderBakker
Esri Esteemed Contributor

This is my schema.ini:

[test2.csv]
ColNameHeader=True
Format=TabDelimited
Col1=X Long
Col2=Y Long
Col3=START_DTTM DateTime
Col4=END_DTTM DateTime
Col5=Value Double


30 Replies
DarrenWiens2
MVP Honored Contributor

I believe you'd have to use the Field Mapping parameter of TableToTable (which uses FieldMap and FieldMappings objects) to force the values into a field of LONG datatype.

Tip: set up the field mapping in the GUI tool, run it, then "Copy As Python Snippet" through the Results window to get the syntax correct.
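For illustration, a minimal sketch of that approach, assuming the offending column is named Value (a hypothetical name) and should come through as a LONG field:

import arcpy

in_table = r"c:\Output\test.csv"
out_location = r"c:\Output"
out_table = "test.dbf"

# build field mappings from the input table, then retype the one field
fms = arcpy.FieldMappings()
fms.addTable(in_table)

idx = fms.findFieldMapIndex("Value")   # "Value" is a hypothetical column name
fm = fms.getFieldMap(idx)
out_field = fm.outputField
out_field.type = "Long"                # force LONG instead of DOUBLE
fm.outputField = out_field
fms.replaceFieldMap(idx, fm)

arcpy.TableToTable_conversion(in_table, out_location, out_table, field_mapping=fms)

As noted above, running the tool once through the GUI and copying it as a Python snippet is the easiest way to confirm the exact syntax.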

XanderBakker
Esri Esteemed Contributor

Not sure, but perhaps ArcGIS honors a schema.ini file: see Schema.ini File (Text File Driver)

ChrisSmith7
Honored Contributor

I can confirm, ArcGIS does indeed honor schema.ini files - I use these when working with flat files/CSVs to ensure data types (e.g. to keep leading zeroes, etc).
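As an illustration of that use case, a schema.ini along these lines (the file and column names here are made up) declares a column as Text so values such as postal codes keep their leading zeroes instead of being read as numbers:

[mydata.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=ZIP Text
Col2=Population Long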

DannyLackey
Deactivated User

Ok.  I have a schema.ini file.  Do I need to point the script to that file in some way or does it simply need to be present in the same directory as the script or output?

ChrisSmith7
Honored Contributor

Just make sure it's in the same directory - I create the schema.ini file in Python and write it to the directory where the table will go. So long as it's formatted correctly and in the same folder (you'll have to specify the target file in schema.ini), ArcGIS should pick up on the driver to determine the database structure. You can see this in action when viewing the tables in ArcCatalog.
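A minimal sketch of that workflow, assuming the CSV sits in c:\Output and borrowing the column names and types that appear later in this thread (adjust both to match the actual file):

import arcpy
import os

folder = r"c:\Output"
csv_name = "test.csv"

# write schema.ini into the same folder as the CSV before converting;
# the bracketed header must match the CSV file name exactly
schema_lines = [
    "[{0}]".format(csv_name),
    "ColNameHeader=True",
    "Format=CSVDelimited",
    "Col1=X Long",
    "Col2=Y Long",
    "Col3=START_DTTM DateTime",
    "Col4=END_DTTM DateTime",
    "Col5=Value Double",
]

with open(os.path.join(folder, "schema.ini"), "w") as f:
    f.write("\n".join(schema_lines) + "\n")

# with schema.ini in place, TableToTable should honor the declared types
arcpy.TableToTable_conversion(os.path.join(folder, csv_name), folder, "test.dbf")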

DannyLackey
Deactivated User

Ok.  My schema.ini file was created when I made a connection to my .csv file in Toad.  After looking at this file, I'm thinking it's not what I'm after.  Can you explain how you created your schema.ini file to suit your needs? 

DannyLackey
Deactivated User

Ok, I found this example: http://stackoverflow.com/questions/12259562/how-can-i-create-schema-ini-file-i-need-to-export-my-csv...  Does this method apply in my situation?

DanPatterson_Retired
MVP Emeritus

and look at the link inside that link: Importing CSV file into Database with Schema.ini - Creating Schema.ini file dynamically or at run-ti...

and the requirements in there

Points to remember before creating Schema.ini

1. The schema information file must always be named 'schema.ini'.

2. The schema.ini file must be kept in the same directory as the CSV file.

3. The schema.ini file must be created before reading the CSV file.

4. The first line of schema.ini must be the name of the CSV file, followed by the properties of the CSV file and then the properties of each column in the CSV file.

and it is a text file... so create it in a text editor and experiment with different options until you get it right.  You can't foul up the csv file in any event... and even if that happened, you would have your backup

DannyLackey
Deactivated User

Something isn't working...

So, after having created the Schema.ini file containing this info:

[test.csv]
ColNameHeader=True
DateTimeFormat=dd-MMM-yyyy
Format=CSVDelimited
Col1=A Long
Col2=B Long
Col3=C DateTime
Col4=D DateTime
Col5=E Double

My conversion script is suddenly failing.  As soon as I remove this Schema.ini file, it runs fine.  Guessing there is something wrong with my Schema.ini file.  Is it clear where I went wrong?  It's in the same directory as the csv.
