Hello! I am running Calculate Field on an attribute table field in an SDE enterprise geodatabase (EGDB) feature class. The feature class was joined to a feature service. I am pulling values from the service field (JOIN table) into the corresponding field in the INPUT table of the EGDB feature class to update the current data. There are 17,538 records, and the process takes less than 10 minutes. I noticed that of the 17,538 records, 11,941 now show <Null> in the field where valid data once existed. As a test, I selected 290 records with null values in the field and ran Calculate Field. The process completed, and the field in the INPUT table was updated with the correct information from the JOIN field.
Is there a limit on the number of records the Calculate Field tool can process, with the rest of the records receiving a null value? I am using Pro 3.4.3. There is one child version on the feature dataset.
Many thanks in advance, Jay
How many rows are being calculated? For many feature services, the maximum number of rows that can be returned by a query is defined in the service. The Esri default is 1,000 rows.
To test, I suggest you use Python to loop or 'chunk' the calculation by OBJECTID or FID (i.e., calculate FID 0 to 999, then repeat for 1000 to 1999, and so on until you're done). Then check and see whether the nulls are gone.
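As a rough sketch of that chunking idea in arcpy (the layer and field names below are placeholders for your own; it assumes the join is already in place on a layer in the map):

import arcpy

layer = "MyJoinedLayer"          # joined feature layer in the map (placeholder)
oid_field = "MyFC.OBJECTID"      # fully qualified OID field of the input table (placeholder)
chunk_size = 1000

# Find the highest OID so the loop knows when to stop.
max_oid = max(row[0] for row in arcpy.da.SearchCursor(layer, [oid_field]))

start = 0
while start <= max_oid:
    where = f"{oid_field} >= {start} AND {oid_field} < {start + chunk_size}"
    arcpy.management.SelectLayerByAttribute(layer, "NEW_SELECTION", where)
    # Calculate Field only touches the selected chunk.
    arcpy.management.CalculateField(
        layer,
        "MyFC.TargetField",            # field to update in the input table (placeholder)
        "!ServiceTable.SourceField!",  # field from the joined service (placeholder)
        "PYTHON3")
    start += chunk_size

arcpy.management.SelectLayerByAttribute(layer, "CLEAR_SELECTION")

If the nulls go away when you chunk it like this, the service's record limit is the likely culprit.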
It could be an issue with the formula you used.
For example, if you have an if statement and don't provide an explicit else, the function returns None for any record that doesn't match the condition.
calc(!Field1!)

def calc(field1):
    if "a" in field1:
        return "Road"
But if you give it an Else, it does fine. This example returns the current value if the "if" doesn't match.
calc(!Field1!, !Field2!)

def calc(field1, field2):
    if "a" in field1:
        return "Road"
    else:
        return field2
Hello, thank you for your help.
I am using the Calculate Field tool 'as is': a Python expression, but no additional code block.
The joined table comes from a feature service from another agency (200K+ records). The input table comes from a feature class in our EGDB with about 17,000+ records, so 17,000+ of the joined table records are joined to the input table. I assumed that once joined, we could operate in 'isolation' on those features alone and not be subject to server or other limitations. I don't know whether the agency has any limit on the number of returned records on their side. The feature service is quite responsive when added to a map in Pro or a web map. Again, thank you for your help in this matter. Regards, Jay
Maybe I'm misunderstanding, but I think it's the same issue regardless.
In this example, I've joined the destination table to the origin table based on the "joinField" field. I want to populate the "ValueToPopulate" field in the origin table with values from the "ValueToCopy" field in the destination table.
There are 6 records in the origin table and 5 in the destination table. The 6th record in the origin table, which has no match in the join, already has a value in the field I want to populate.
I perform the join.
Then I open the Field Calculator and input the following:
!destTable.ValueToCopy!
And hit apply
My value in the pre-populated row was overwritten with null.
To solve this, you have two primary options:
1) Use a code block with an if statement, as detailed in my earlier post.
2) Select only the records you want to populate before running the calculation (see the sketch below).
The Field Calculator will only operate on what's present or selected (a selection takes priority).
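For example, here is a minimal sketch of option 2 with arcpy, assuming the join is already in place and using the layer/field names from my example above (the join-table name "destTable" is my own placeholder):

import arcpy

layer = "origTable"  # the joined layer in the map (placeholder name)

# Select only the rows that actually have a matching join record,
# so unmatched rows keep whatever value they already have.
arcpy.management.SelectLayerByAttribute(
    layer,
    "NEW_SELECTION",
    "destTable.ValueToCopy IS NOT NULL")

# Calculate Field only operates on the selected rows.
arcpy.management.CalculateField(
    layer,
    "origTable.ValueToPopulate",
    "!destTable.ValueToCopy!",
    "PYTHON3")

# Clear the selection when finished.
arcpy.management.SelectLayerByAttribute(layer, "CLEAR_SELECTION")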
Hi Alfred,
Thank you for this. I am doing what you demonstrated regarding joining and calculating. I have been manually selecting batches of rows and running Field Calculator on just those selected records (the selection takes priority, as you mentioned). It's when I select a larger number of records, say 3,000+, that I see the behavior of the calculation not completing for all of them. Richard's post is worth exploring with the agency that owns the feature service. Regards, Jay