Hello all,
Looking for ideas on updating several attributes for about 300k records.
We are using ArcGIS Pro 2.9.5 with a utility network, and branch versioning is enabled.
Currently I add a query table from another database that has the data I need, then do a table join and use Calculate Field. I've been doing this in batches of around 10k records, and each batch takes about 30-60 minutes to run.
While this works, it is obviously quite time-consuming, and as our business is still new to ArcGIS we are unsure whether there are more efficient methods.
Any advice, or do I just bite the bullet and continue with the table-join method?
Thanks.
A couple of extra details: these are not network attributes, so no dirty areas, but one attribute contains a unique value for each record, so I can't group and update them all via the Attributes pane for that one.
The other two fields fall into about 20 distinct values, so I can possibly do those via the Attributes pane if I can select the right features.
Joe, take a look at this post by Richard Fairhurst (fair to say it was life changing for me)
Turbo-charging data manipulation with python
The basic premise is using an update cursor to apply a dictionary you build from a common value shared between your two datasets. If you scroll down to the section titled:
"Using a Python Dictionary Built using a da SearchCursor to Replace a Join Connecting Two Feature Classes"
...there is sample code that will help you. I suspect you'll find the speed difference incredible!
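To give a feel for the pattern, here is a minimal sketch of the dictionary-based "join" it describes, using plain Python lists in place of feature classes so it runs anywhere. In ArcGIS Pro you would build the dictionary with `arcpy.da.SearchCursor` and write values back with `arcpy.da.UpdateCursor`; all field values and row data below are made up for illustration.

```python
# Dictionary-based join pattern (after Richard Fairhurst's post), shown with
# plain lists standing in for feature classes. Hypothetical example data.

# Rows from the "source" table: (join_key, attr_a, attr_b)
source_rows = [
    ("A100", "Steel", 150),
    ("A101", "Copper", 300),
]

# Rows from the "target" feature class: [join_key, attr_a, attr_b]
target_rows = [
    ["A100", None, None],
    ["A101", None, None],
    ["A999", None, None],  # no match in source; left untouched
]

# Step 1: one pass over the source builds a lookup dictionary.
# With arcpy this would be roughly:
#   lookup = {r[0]: r[1:] for r in arcpy.da.SearchCursor(src_table, fields)}
lookup = {key: (a, b) for key, a, b in source_rows}

# Step 2: one pass over the target applies the values.
# With arcpy this loop would be an arcpy.da.UpdateCursor, calling
# cursor.updateRow(row) after assigning the matched values.
updated = 0
for row in target_rows:
    match = lookup.get(row[0])
    if match:
        row[1], row[2] = match
        updated += 1

print(updated)  # 2
```

Because both datasets are read exactly once and the matching happens in memory, this avoids the per-batch join overhead entirely, which is where the speed-up over a table join comes from.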
Thanks, I think I've found a method via FME, but I will keep this as another option.