I noticed that when using arcpy.CalculateField_management in Python on a feature class, it changes the field scale (decimal places). I had a shapefile with an Acres field with a field scale of 3, but it now seems to have a field scale of 11. Why does running arcpy.CalculateField_management change the field scale? This is annoying after you export the table and open it.
I could not reproduce this with a shapefile. However, for a File Geodatabase feature class, the Precision and Scale are ignored. See the following link.
When you create a float, double, or integer field and specify 0 for precision and scale, the tool will attempt to create a binary type field if the underlying database supports it. Personal and file geodatabases support only binary type fields, and precision and scale are ignored.
At first you state that you are using a feature class, then a shapefile, and then at some point you are performing an export. Can you explain your workflow a little more?
Also, what version of the software are you using, including service packs (e.g., 10.1 SP1)?
The current script is run in a temp GDB; I decided to run the script on a shapefile for testing because of the decimal issues I was having. After the script runs, I export the table and use it in Excel.
Before running CalculateField_management, my field scale is set to 3 (for example, Acres = 1.236), but after running CalculateField_management, my field scale is set to 6 (for example, Acres = 1.236789).
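One workaround, regardless of how the field's stored scale ends up, is to round inside the expression you pass to CalculateField_management, so each calculated value only carries three decimal places. This is a hedged sketch, not your exact script — the feature class path and field name are hypothetical, and the arcpy call is shown as a comment since it only runs inside an ArcGIS Python session:

```python
# The expression string is evaluated per row by the field calculator.
# Rounding inside it caps the value at three decimal places, whatever
# scale the output field reports.
expression = "round(!Shape.Area@ACRES!, 3)"

# Hypothetical call (requires arcpy / an ArcGIS session):
# import arcpy
# arcpy.CalculateField_management(r"C:\temp\test.gdb\parcels", "Acres",
#                                 expression, "PYTHON")

# What the round() in the expression does to each value, in plain Python:
def round_acres(value, places=3):
    return round(value, places)

print(round_acres(1.236789))  # 1.237
```

This does not change the field's precision or scale metadata; it just ensures the stored values never have more than three decimals, so the table looks right after export to Excel.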