Update attribute on large dataset stored in AGOL

11-10-2021 12:51 AM
StuartMoore
Frequent Contributor

So I have a point dataset stored in AGOL as a feature service containing a little over 12 million rows, and I need to update one field based on some criteria.

My initial thought was to create a script to do it in Notebooks in AGOL:

# slry is the hosted feature layer being edited, defined earlier in the notebook
updCount = 0
while True:
    try:
        # pull the next batch of up to 2,000 features that still need updating
        updSym = slry.query(where="PROV_SRC='aaa' AND Symbology IS NULL",
                            return_all_records=False, result_record_count=2000)
        print(len(updSym.features))
        if len(updSym.features) == 0:
            break
        else:
            for x in updSym.features:
                # loops all returned records and updates the Symbology field to 5
                print("ObjectID:- {0} -- {1}".format(x.attributes['OBJECTID'], x.attributes['PROV_SRC']))
                x.attributes['Symbology'] = 5
            # commits the edits for this batch
            res = slry.edit_features(updates=updSym.features)
            resstr = str(res)

            # counts the number of 'success': True entries in the results and adds it to the running total
            updCount = updCount + resstr.count("'success': True")
            print("updated:- {0}".format(updCount))
    except Exception as e:
        print("error: {0}".format(e))

The script above is just for one of the criteria, and it does work, but it's slow: I think it's running at about 20 seconds per 2,000 rows...

It's running in Notebooks as Standard rather than Advanced; not sure if that makes a difference.

Is there a better way, short of updating the original FGDB locally on my laptop and re-publishing it?
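
I did wonder whether pushing the whole update to the server with FeatureLayer.calculate() would be quicker, since the rows wouldn't have to round-trip through the notebook, but I haven't tried it on this layer and I'm not sure it supports it (the layer needs the supportsCalculate capability). A rough, untested sketch of what I had in mind, with the item id as a placeholder:

from arcgis.gis import GIS

gis = GIS("home")  # running inside an AGOL notebook

# placeholder item id for the hosted feature service
item = gis.content.get("<feature service item id>")
slry = item.layers[0]

# server-side field calculation: no features are pulled down to the notebook
result = slry.calculate(
    where="PROV_SRC='aaa' AND Symbology IS NULL",
    calc_expression=[{"field": "Symbology", "value": 5}]
)
print(result)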

Stu

 

1 Reply
StuartMoore
Frequent Contributor

Spent the day trying to do it via a replica, but there seems to be no way to upload the data back into AGOL to sync 😞
