I want to mass update 3,000 existing rows in an attribute table using an Excel table, without deleting the geometry. Are there any tools or methods that can do that?
I'm guessing the problem they would face with performing a join is that it would add several new fields to the existing table rather than solve the problem in one shot. Though they could always run Calculate Field afterwards and copy the data over that way.
As long as you have a common field between the tables, use a join. But it would be a good idea to use Excel To Table to bring the table into your geodatabase first.
Yes, you are right. Joining the table creates additional fields, and I just want to use the existing schema for the update. Because I have 3,000 rows with 20+ fields, Calculate Field cannot update so many things in one shot. I am looking for a tool in ArcGIS with that function, or some good method to achieve a fast update. Thank you~
Make the join permanent by saving to a new feature class, then just delete the redundant fields. Deletion is much quicker than calculating. If you need speed, you can do this externally with arcpy and/or numpy, but I would suggest just using what you have and the tools you are familiar with: do the join, save the combined result, then delete the fields you don't want.
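For reference, the join-save-delete workflow described above could be sketched like this. The layer, table, and field names here are hypothetical placeholders, not from the original post; the only real logic is working out which fields came from the joined table, which is plain Python:

```python
# Sketch of the join -> save -> delete-fields workflow (all names are
# hypothetical placeholders). In ArcGIS you would run something like:
#
#   arcpy.management.AddJoin('parcels_lyr', 'ParcelID', 'update_tbl', 'ParcelID')
#   arcpy.management.CopyFeatures('parcels_lyr', 'parcels_joined')
#   arcpy.management.DeleteField('parcels_joined', redundant)
#
# When a joined layer is saved with qualified field names on, the joined
# fields are typically prefixed with the source table name, so they can
# be picked out for deletion like this:

def redundant_fields(field_names, join_table):
    """Return the fields that a join copied in from join_table."""
    prefix = join_table + '_'
    return [f for f in field_names if f.startswith(prefix)]

fields = ['OBJECTID', 'ParcelID', 'Owner',
          'update_tbl_ParcelID', 'update_tbl_Owner']
print(redundant_fields(fields, 'update_tbl'))
# -> ['update_tbl_ParcelID', 'update_tbl_Owner']
```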
Sometimes it is also best to split the data into rows to fix and rows to keep: extract the rows that need updating, perform the above operation, delete the extra columns, then append the rows that didn't need fixing. Sometimes it is quicker to divide and recombine into a new incarnation. People often spend too much time trying to make changing feature classes work for them, when recreating new ones in the desired structure is often quicker.
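The split-and-recombine idea above can be driven by a simple attribute selection. A minimal sketch, where the field name and ID values are made up for illustration:

```python
# Sketch of splitting rows to fix from rows to keep (names are
# hypothetical). In ArcGIS the where clause below would feed, e.g.:
#
#   arcpy.analysis.Select('parcels', 'parcels_to_fix', where)
#   arcpy.analysis.Select('parcels', 'parcels_to_keep', 'NOT (' + where + ')')
#   ... fix parcels_to_fix via the join, then Append parcels_to_keep ...

def in_clause(field, values):
    """Build a SQL IN clause selecting the rows that need updating."""
    quoted = ", ".join("'{}'".format(v) for v in values)
    return "{} IN ({})".format(field, quoted)

where = in_clause("ParcelID", ["A101", "A102", "B205"])
print(where)  # -> ParcelID IN ('A101', 'A102', 'B205')
```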
Thank you Dan, I will try it this way. But it looks like no quick tool can do this... That's a pity.
Zhan, is it only one field that you are using as the linking variable between the two tables? Or is it more complicated than that?
There is one common field. I want to keep the schema and just update 3,000 of the attribute rows; there are more than 20 fields in this attribute table, which has 50,000+ rows.
I'm guessing you could use a 'simple' Python script to do this update, but it would be a no-turning-back sort of deal (no undo button).
Maybe something like this (I'm sure there's a way to make it more efficient; I'm just not the most Python-savvy):
# import modules
import sys
import arcpy
from time import strftime

# start the timer to see how long the script takes
print 'Start Script: ' + strftime('%Y-%m-%d %H:%M:%S')

# identify variables for feature class and table
workspace = 'C:/Users/Name/Documents/GISData.gdb'
fc = 'FeatureClassName'
# tbl = 'TableName'  # if using a table in your geodatabase
# if not, use this for a CSV
tbl = 'C:/Users/Name/Documents/TableName.csv'

# set the workspace environment to our workspace
arcpy.env.workspace = workspace

# identify fields used for updating - only pick the ones you need
# for this example, we are only using 10 fields
fieldsFC = ['Field1', 'Field2', 'Field3', 'Field4', 'Field5',
            'Field6', 'Field7', 'Field8', 'Field9', 'Field10']

# Here is what we're going to do:
# One-by-one, check each row in the Feature Class against
# each row in the Table, and update if there are changes...

# first, it helps to "zero" out the rows and the counter
fcrow = ['', '', '', '', '', '', '', '', '', '']
tblrow = ['', '', '', '', '', '', '', '', '', '']
counter = 0

# putting this all in a try-except statement to catch any errors
try:
    with arcpy.da.SearchCursor(tbl, fieldsFC) as tblCursor:
        for tblrow in tblCursor:
            with arcpy.da.UpdateCursor(fc, fieldsFC) as fcCursor:
                fcrow = ['', '', '', '', '', '', '', '', '', '']  # ...need to zero out the row again...
                # in this example, I am seeing if the first column (tblrow[0])
                # matches AND if the fourth column (tblrow[3]) matches,
                # then I go through with the updating of the rows in the
                # feature class (fc) with the table rows
                for fcrow in fcCursor:
                    if (str(tblrow[0]) == str(fcrow[0]) and str(tblrow[3]) == str(fcrow[3])):
                        fcrow[1] = tblrow[1]
                        fcrow[2] = tblrow[2]
                        fcrow[4] = tblrow[4]
                        fcrow[5] = tblrow[5]
                        fcrow[6] = tblrow[6]
                        fcrow[7] = tblrow[7]
                        fcrow[8] = tblrow[8]
                        fcrow[9] = tblrow[9]
                        print('Row number ' + str(fcrow[0]) + ' was updated.')
                        fcCursor.updateRow(fcrow)
                        counter = counter + 1
                        continue
# except statements to catch the errors
except arcpy.ExecuteError:
    print(arcpy.GetMessages(2))
except Exception:
    e = sys.exc_info()[1]
    print(e.args)
    arcpy.AddError(e.args)

# how many rows were updated?
print 'Updated ' + str(counter) + ' rows.'

# end the timer to see how long the script took
print 'Finished Script: ' + strftime('%Y-%m-%d %H:%M:%S')
Adrian Welsh, for speed it is better to build a dictionary of the values and use a single update cursor to update the relevant records. I would also recommend importing the Excel file to a file geodatabase table first rather than using CSV or Excel directly, so you don't have to guess the data types.
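A minimal sketch of that dictionary approach, with made-up field values. Plain lists stand in for the cursors here so the logic is visible; with arcpy you would iterate over `arcpy.da.SearchCursor(tbl, fields)` and `arcpy.da.UpdateCursor(fc, fields)` in the same pattern:

```python
# Dictionary-based update: one pass over the source table to build a
# lookup keyed on the common field, then one pass over the feature
# class with an update cursor. This avoids the nested-cursor rescans
# of the script above.

def build_lookup(rows):
    """Key each row on its first value (the common/join field)."""
    return {row[0]: row[1:] for row in rows}

def apply_updates(rows, lookup):
    """Return rows with non-key values replaced where the key matches."""
    updated = []
    for row in rows:
        if row[0] in lookup:
            row = [row[0]] + list(lookup[row[0]])
        updated.append(row)
    return updated

table_rows = [('A101', 'Smith', 2.5), ('B205', 'Jones', 1.0)]
fc_rows = [['A101', 'old', 0.0], ['C300', 'Lee', 3.2]]
print(apply_updates(fc_rows, build_lookup(table_rows)))
# -> [['A101', 'Smith', 2.5], ['C300', 'Lee', 3.2]]
```

With an arcpy UpdateCursor you would call `cursor.updateRow(row)` inside the loop instead of collecting the rows into a list.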
Maybe I should explain more about the python script listed above.
I have a set of parcels with different attributes that need to be updated weekly. The engineer has an Excel sheet, which I initially generated from the attribute table, in which he updates various fields for each parcel of land. To uniquely identify each parcel, I have to match on two attributes. What the Python script does is check whether the two attributes match, then use an update cursor to change the fields.
I use a CSV file since I get an Excel file each week and just save it as a CSV and run the script from there. Now, I don't know enough about Python to make it more efficient, but this works for me. I am not sure whether it would be speedier to load the Excel table into a geodatabase each time versus just using a CSV, since it's normally only about 200 records.
So, what I'm saying is, this exact script would not necessarily work for the OP, but it's similar to what I use for matching on two fields (without creating a new field that concatenates the other two, etc.). I hope that makes sense.
Thank you very much, but I am not familiar with Python scripting. I plan to learn more about it.
Excel To Table
Join the table to the FC
Calculate Field using the new field in the join
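Spelled out as arcpy calls, those three steps might look like the sketch below. All paths, layer, table, and field names are placeholder assumptions; the only testable logic is building the qualified field reference that the Calculate Field step needs after a join:

```python
# Sketch of: Excel To Table -> Add Join -> Calculate Field
# (all names here are hypothetical placeholders). In ArcGIS:
#
#   arcpy.conversion.ExcelToTable('updates.xlsx', 'gdb_updates')
#   arcpy.management.AddJoin('parcels_lyr', 'ParcelID', 'gdb_updates', 'ParcelID')
#   arcpy.management.CalculateField('parcels_lyr', 'parcels.Owner',
#                                   join_expression('gdb_updates', 'Owner'))

def join_expression(join_table, field):
    """Field-calculator expression referencing a joined table's field."""
    return '!{}.{}!'.format(join_table, field)

print(join_expression('gdb_updates', 'Owner'))  # -> !gdb_updates.Owner!
```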
Joe, the calculate field thing was mentioned earlier. The OP said:
Because I have 3,000 rows with 20+ fields, Calculate Field can not update so many things in one-shot.
Just an update: in the end, I used the field calculator to update each field after joining the two tables. It takes a long time to calculate 3,000+ records, but it works. Thank you all for your input~
I'm glad to hear that using field calculator essentially solved your issues!