Mass Update Attribute Table Rows

01-25-2018 02:45 PM
ZhanShi
New Contributor

Hi~ Everyone, 

I want to mass update 3,000 existing rows in an attribute table using an Excel table, without deleting the geometry. Are there any tools or methods that can do that?

Thank you

1 Solution

Accepted Solutions
AdrianWelsh
MVP Honored Contributor

I'm guessing the problem they would face with performing a join is that it would basically add several new fields to the existing table and not necessarily solve the problem in one shot. Though, after the join they can always run Calculate Field to copy the data over that way.


15 Replies
DanPatterson_Retired
MVP Emeritus

As long as you have a common field between the tables, use

Add Join....

But it would be a good idea to use

Excel To Table to bring the table into your geodatabase first
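A minimal sketch of those two steps might look like the following. Every name in it (the paths, "UpdateTable", "FeatureClassName", and the key field "CommonID") is an assumption for illustration, not from the thread; note that Add Join works on a layer or table view, not on the feature class directly.

```python
def main():
    import arcpy
    # assumed geodatabase path -- substitute your own
    arcpy.env.workspace = r'C:/Users/Name/Documents/GISData.gdb'

    # 1) bring the Excel table into the geodatabase first
    arcpy.ExcelToTable_conversion(r'C:/Users/Name/Documents/Updates.xlsx',
                                  'UpdateTable')

    # 2) join it to the feature class on the common field
    # (Add Join requires a layer or table view, not the feature class itself)
    arcpy.MakeFeatureLayer_management('FeatureClassName', 'fc_lyr')
    arcpy.AddJoin_management('fc_lyr', 'CommonID', 'UpdateTable', 'CommonID')

if __name__ == '__main__':
    main()
```

The arcpy calls are wrapped in `main()` so the sketch can be read and imported without an ArcGIS install.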

AdrianWelsh
MVP Honored Contributor

I'm guessing the problem they would face with performing a join is that it would basically add several new fields to the existing table and not necessarily solve the problem in one shot. Though, after the join they can always run Calculate Field to copy the data over that way.

ZhanShi
New Contributor

Yes, you are right. Joining the table creates additional fields, and I just want to use the existing schema for the update. Because I have 3,000 rows with 20+ fields, Calculate Field cannot update that many things in one shot. I am looking for a tool in ArcGIS with that function, or some good method that can achieve a fast update. Thank you~

DanPatterson_Retired
MVP Emeritus

Make the join permanent by saving to a new feature class, then just delete the redundant fields.  Deletion is way quicker than calculating.  If you need speed, you can do this externally with arcpy and/or numpy, but I would suggest just using what you have and the tools you are familiar with: do the join, save the combined result, then delete the fields you don't want.

Sometimes it is also best just to split the data into rows to fix and rows to keep.  Extract the rows that need updating, perform the above operation, delete the extra columns, then append the rows that didn't need fixing.  Sometimes it is quicker to divide and recombine into a new incarnation.  Often people spend too much time trying to make changing feature classes work for them, when recreating new ones in the desired structure is often quicker.
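The join-save-delete workflow described above could be sketched roughly as below. The paths, dataset names, and the key field "CommonID" are assumptions; `fields_to_delete` is a hypothetical helper, and setting `qualifiedFieldNames` to False keeps the joined fields from being saved with table-qualified names.

```python
def fields_to_delete(all_fields, keep_fields):
    """Hypothetical helper: joined fields not in the schema we want to keep."""
    keep = set(keep_fields)
    return [f for f in all_fields if f not in keep]

def main():
    import arcpy
    arcpy.env.workspace = r'C:/Users/Name/Documents/GISData.gdb'
    # avoid "TableName_FieldName" style names in the saved copy
    arcpy.env.qualifiedFieldNames = False

    # join on the common field (joins need a layer, not a feature class)
    arcpy.MakeFeatureLayer_management('FeatureClassName', 'fc_lyr')
    arcpy.AddJoin_management('fc_lyr', 'CommonID', 'UpdateTable', 'CommonID')

    # make the join permanent by saving to a new feature class
    arcpy.CopyFeatures_management('fc_lyr', 'FeatureClass_joined')

    # deleting the redundant fields is quicker than calculating
    keep = [f.name for f in arcpy.ListFields('FeatureClassName')]
    extra = fields_to_delete(
        [f.name for f in arcpy.ListFields('FeatureClass_joined')], keep)
    if extra:
        arcpy.DeleteField_management('FeatureClass_joined', extra)

if __name__ == '__main__':
    main()
```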

ZhanShi
New Contributor

Thank you Dan, I will try it this way. But it looks like there is no quick tool that can do this... That's a pity.

AdrianWelsh
MVP Honored Contributor

Zhan, is it only one field that you are using as the linking variable between the two tables? Or is it more complicated than that?

ZhanShi
New Contributor

There is one common field. I want to keep the schema and just update 3,000 of the attribute rows, and there are more than 20 fields in this attribute table with 50,000+ rows.

AdrianWelsh
MVP Honored Contributor

I'm guessing you could use a 'simple' Python script to do this update, but it would be a no-turning-back sort of deal (no undo button).

Maybe something like this (and I'm sure there's a way to make this more efficient... I'm just not the most Python savvy):

# import modules
import sys
import arcpy
from time import strftime

# start the timer to see how long the script takes
print('Start Script: ' + strftime('%Y-%m-%d %H:%M:%S'))

# identify variables for the feature class and table
workspace = 'C:/Users/Name/Documents/GISData.gdb'

fc = 'FeatureClassName'

# tbl = 'TableName'  # if using a table in your geodatabase
# if not, use this for a CSV
tbl = 'C:/Users/Name/Documents/TableName.csv'

# set the workspace environment to our workspace
arcpy.env.workspace = workspace

# identify fields used for updating - only pick the ones you need
# for this example, we are only using 10 fields
fieldsFC = ['Field1', 'Field2', 'Field3',
            'Field4', 'Field5', 'Field6',
            'Field7', 'Field8',
            'Field9', 'Field10']

# Here is what we're going to do:
# one by one, check each row in the feature class against
# each row in the table, and update it if there are changes
# (note: this re-scans the feature class once per table row)
counter = 0

# putting this all in a try/except statement to catch any errors
try:
    with arcpy.da.SearchCursor(tbl, fieldsFC) as tblCursor:
        for tblrow in tblCursor:
            # in this example, if the first column (tblrow[0]) matches
            # AND the fourth column (tblrow[3]) matches, the feature
            # class (fc) row is updated with the table row's values
            with arcpy.da.UpdateCursor(fc, fieldsFC) as fcCursor:
                for fcrow in fcCursor:
                    if (str(tblrow[0]) == str(fcrow[0])
                            and str(tblrow[3]) == str(fcrow[3])):
                        fcrow[1] = tblrow[1]
                        fcrow[2] = tblrow[2]
                        fcrow[4] = tblrow[4]
                        fcrow[5] = tblrow[5]
                        fcrow[6] = tblrow[6]
                        fcrow[7] = tblrow[7]
                        fcrow[8] = tblrow[8]
                        fcrow[9] = tblrow[9]
                        fcCursor.updateRow(fcrow)
                        print('Row number ' + str(fcrow[0]) + ' was updated.')
                        counter = counter + 1

# the more specific exception must come first, or it would never be reached
except arcpy.ExecuteError:
    print(arcpy.GetMessages(2))
except Exception:
    e = sys.exc_info()[1]
    print(e.args[0])
    arcpy.AddError(e.args[0])

# how many rows were updated?
print('Updated ' + str(counter) + ' rows.')
# end the timer to see how long the script took
print('Finished Script: ' + strftime('%Y-%m-%d %H:%M:%S'))
XanderBakker
Esri Esteemed Contributor

MidnightYell2003, for speed it is better to create a dictionary with the values and use a single update cursor to update the relevant records. I would also recommend importing the Excel file into a file geodatabase table first rather than reading the CSV or Excel directly, so that ArcGIS doesn't have to guess the data types.
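A sketch of that dictionary approach: read the update table once into a dict, then make a single pass over the feature class with an update cursor. The dataset names, the key field "CommonID", and the field list are assumptions for illustration, and `build_lookup` is a hypothetical helper.

```python
def build_lookup(rows):
    """Hypothetical helper: map key value -> tuple of replacement values."""
    return {row[0]: tuple(row[1:]) for row in rows}

def main():
    import arcpy
    arcpy.env.workspace = r'C:/Users/Name/Documents/GISData.gdb'

    key = 'CommonID'
    fields = ['Field1', 'Field2', 'Field3']  # the 20+ fields to update

    # one read of the (imported) update table...
    with arcpy.da.SearchCursor('UpdateTable', [key] + fields) as cur:
        lookup = build_lookup(cur)

    # ...then one pass over the 50,000-row feature class
    updated = 0
    with arcpy.da.UpdateCursor('FeatureClassName', [key] + fields) as cur:
        for row in cur:
            if row[0] in lookup:
                row[1:] = list(lookup[row[0]])  # da cursor rows are lists
                cur.updateRow(row)
                updated += 1
    print('Updated {} rows.'.format(updated))

if __name__ == '__main__':
    main()
```

Compared with the nested-cursor script above, this touches each feature class row exactly once, so the cost scales with 50,000 + 3,000 rows rather than their product.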