In general, no, it is not "normal" for Pro to take 20+ minutes to calculate a field with that many records. That said, any discussion of field-calculation performance starts with sharing what the data source is, because calculating fields against a local file geodatabase, an enterprise geodatabase, or a hosted feature layer performs very differently.
That doesn't surprise me. I am guessing that if you dumped the data to a FGDB and ran the same calculation, it would take < 30 seconds. If you are working with a version that has a very long/deep state tree, that can lead to significant performance degradation for activities like updating data. I would speak to whoever manages your EGDB, have them look at how many versions are outstanding, and see whether those versions can be reconciled and posted.
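If you want to run that comparison yourself, the sketch below exports the feature class to a local FGDB and times the same calculation there. This is a rough illustration, not a definitive script: the paths, feature class name, field name, and expression are all hypothetical placeholders for your own data, and it assumes a recent Pro release where `arcpy.conversion.ExportFeatures` is available (older releases would use `FeatureClassToFeatureClass` instead).

```python
import time
import arcpy

# Hypothetical paths -- substitute your own .sde connection file and feature class.
sde_fc = r"C:\connections\prod.sde\gisdb.dbo.Parcels"
scratch_folder = r"C:\temp"
scratch_gdb = scratch_folder + r"\scratch.gdb"
local_fc = scratch_gdb + r"\Parcels"

# Create a local file geodatabase and copy the enterprise data into it.
arcpy.management.CreateFileGDB(scratch_folder, "scratch")
arcpy.conversion.ExportFeatures(sde_fc, local_fc)

# Time the same field calculation against the local copy.
# Field name and expression are placeholders -- use whatever you ran in Pro.
start = time.time()
arcpy.management.CalculateField(local_fc, "ACRES", "!shape.area@acres!", "PYTHON3")
print(f"FGDB calculation took {time.time() - start:.1f} seconds")
```

If the local run finishes in seconds, the calculation itself is fine and the bottleneck is on the geodatabase side (state tree depth, network latency, or overhead from archiving/editor tracking), which is where the questions below come in.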
What is the RDBMS type and version being used?
Is the Pro client in the same location as the EGDB?
Is there a relationship class, archiving, or editor tracking involved on the feature class?
What is the calculation that is being performed?