
insertion of concatenated records

Question asked by tmw2 on Mar 4, 2014
Latest reply on Mar 10, 2014 by tmw2
Hello :)

I'm currently trying to load records of the form record = 'value1, value2, ..., value17' into a layer as simply as possible.
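For concreteness, the per-record parsing I need is just a split plus a float conversion for the coordinate columns (a minimal sketch; indices 8 and 9 for latitude/longitude match my schema and are otherwise placeholders):

```python
def parse_record(record):
    """Split one concatenated record into its values and an (x, y) pair."""
    vals = record.split(",")
    lat = float(vals[8])   # latitude column in my data
    lon = float(vals[9])   # longitude column in my data
    return vals, (lon, lat)

# Example with 17 dummy values:
vals, xy = parse_record("a,b,c,d,e,f,g,h,48.85,2.35,k,l,m,n,o,p,q")
# vals has 17 entries; xy == (2.35, 48.85)
```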

So far I export my data as CSV files and import them using arcpy's MakeXYEventLayer_management and FeatureClassToFeatureClass_conversion (then append them to some layer).

arcpy.MakeXYEventLayer_management(my_csv, "longitude", "latitude", xy_layer, my_sr)
arcpy.FeatureClassToFeatureClass_conversion(xy_layer, my_gdb, "new_fclass", fieldmappings)
arcpy.Append_management(my_gdb + "\\new_fclass", otherTable, "TEST", "", "")


I'm now trying to improve the loading speed and get rid of the CSV file. This should do the trick:

fieldmappings = arcpy.FieldMappings()
fieldmappings.addTable(otherTable)

cursor = arcpy.InsertCursor(otherTable)
pnt = arcpy.Point()

for row in data:  # rows come from a cx_Oracle cursor
    feat = cursor.newRow()
    vals = row.split(",")
    # spatial reference change?
    pnt.X = float(vals[9])  # long
    pnt.Y = float(vals[8])  # lat
    feat.SHAPE = pnt
    for i in range(fieldmappings.fieldCount):
        # FieldMap has no .name property; use outputField.name
        feat.setValue(fieldmappings.getFieldMap(i).outputField.name, vals[i])
    cursor.insertRow(feat)

del cursor  # release the schema lock
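One alternative I'm considering is the newer arcpy.da.InsertCursor (available since 10.1), which as I understand it takes plain tuples and avoids the per-row object overhead. A sketch of what I mean, where the table path and field names are placeholders and the coordinate indices match my schema:

```python
def record_to_row(record):
    """Turn one concatenated record into a tuple ending with the SHAPE@XY pair."""
    vals = record.split(",")
    return tuple(vals) + ((float(vals[9]), float(vals[8])),)  # (x, y) = (long, lat)

def load_records(target_table, field_names, data):
    # arcpy is imported here so record_to_row stays usable without ArcGIS
    import arcpy
    # SHAPE@XY is the da-cursor token for the geometry as an (x, y) tuple
    with arcpy.da.InsertCursor(target_table, field_names + ["SHAPE@XY"]) as cur:
        for record in data:
            cur.insertRow(record_to_row(record))
```

The da cursor's context manager also takes care of releasing the lock, so no explicit del is needed.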



Is there a cleaner way to do this?
Would I get better results by inserting my records into a temporary ValueTable first, or by using an Editor? (And where would the fieldmappings go in that case?)

Would ArcSDESQLExecute or an "in_memory" temp table be a faster solution?
