
High Precision Date Field Issue when Registering Tables

12-06-2023 11:12 AM
RyanUthoff
Frequent Contributor

I've encountered an issue with registering tables now that the new "high precision" datetime field is available in Pro 3.2 / Enterprise 11.2. When testing an identical, empty table and field, ArcGIS Pro sometimes registers the field as a regular "low precision" datetime field and other times as a "high precision" datetime field. This is with MS SQL Server (the DATETIME type, not DATETIME2(7)). We do not currently wish to use the "high precision" datetime type.

When creating a table within ArcGIS Pro 3.2, we can add a regular datetime field and then migrate that field to high precision afterwards using the Esri tool. That's not an issue.

However, when we register pre-existing tables created through SSMS using the Register with Geodatabase tool, we encounter some strange behavior. We have run several tests with multiple databases and tables, and sometimes Pro registers the field as a regular datetime field, and sometimes it registers it as a "high precision" datetime field.

We performed these tests on the exact same tables in multiple databases, all using the regular datetime field type in SQL Server. Sometimes Pro registers the field as "high precision", and other times it does not.

This is very problematic for us because there is no way (to my knowledge) to "downgrade" a field back to a low precision datetime in Esri, and high precision dates are only compatible with ArcGIS Enterprise 11.2, meaning we can't publish feature services containing "high precision" datetime fields to lower Enterprise versions.

Is there any way we can "force" ArcGIS Pro to recognize a field as low precision when registering a table with the geodatabase? Or is this a bug? We need Pro to be consistent when registering tables with the geodatabase, not randomly picking and choosing whether to use "high precision".

15 Replies
RyanUthoff
Frequent Contributor

Just to update anyone who might come across this issue: I have found a way to "un-migrate" or "downgrade" back to a "low precision" date field. In SSMS, navigate to the GDB_ITEMS table and query it to find the table you need to remove "high precision" from. The Definition column in that table stores each item's definition as XML, and that is where the "high precision" information lives. You can use the following query to remove "high precision":

UPDATE [dbo].[GDB_ITEMS]
SET Definition.modify('delete (/DETableInfo/GPFieldInfoExs/GPFieldInfoEx/HighPrecision)')
WHERE ObjectID = (your table ObjectID)

Note that you might need to modify the query if your table has multiple date fields with "high precision."
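
For example, to strip the flag from one field only, you can narrow the delete path with a predicate. This is just a sketch; it assumes each GPFieldInfoEx element carries a Name child holding the field name, and MyDateField is a placeholder:

UPDATE [dbo].[GDB_ITEMS]
SET Definition.modify('delete (/DETableInfo/GPFieldInfoExs/GPFieldInfoEx[Name="MyDateField"]/HighPrecision)')
WHERE ObjectID = (your table ObjectID)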

I'm sure this is not the recommended way of doing this, but we don't have much choice when ArcGIS Pro randomly assigns date fields as "high precision" when registering tables with the geodatabase.
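
And if anyone wants to audit their geodatabase first, something like this should list every registered item whose definition still carries the flag (also a sketch; the // path is there because I haven't verified that views use the same DETableInfo root element as tables):

SELECT ObjectID, Name
FROM [dbo].[GDB_ITEMS]
WHERE Definition.exist('//GPFieldInfoEx/HighPrecision') = 1;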

AnnieJames21
New Contributor

Thanks for checking out the new high precision datetime field. As you mentioned, once a registered class is migrated to a high precision Date, there is no supported way to downgrade it. Inconsistent behavior, though, is not what we want. If the input to the Register with Geodatabase GP tool is the database table itself (i.e., you browse to the table in the tool), then the Date field will be described as high precision and will register with high precision. If you want the resulting Date field to be low precision:

  1. In the Project backstage, go to Options -> Map and Scene -> Add Layers and Tables, and under the "Query layers and text files" label, check the option to map field types to pre-3.2 field types.
  2. Create a new query layer and confirm the Date field is described as low precision (a minimal example follows).
  3. Use the newly created query layer as the input to the GP tool to register the class with the geodatabase.
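
The query layer in step 2 is defined by a plain SQL statement against the table; something like this is enough (table and field names here are hypothetical):

SELECT OBJECTID, my_date_field
FROM dbo.MyTable;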
Documentation: https://pro.arcgis.com/en/pro-app/latest/help/mapping/properties/default-settings-for-new-maps-and-s...


RyanUthoff
Frequent Contributor

Thank you for your reply. After thorough testing across more than 20 DBs, using the Register with Geodatabase GP tool does NOT always register tables as high precision. It is random. Furthermore, the Esri documentation you provided says "these fields may be assigned to the new field types." It does not say the field definitively will be assigned, only that it may be. I find that incredibly frustrating, because the documentation itself is stating that registering literally the exact same field type might or might not produce a high precision field.

While I appreciate that Esri does offer an option to opt out of these new field types, I find it frustrating that the option is buried in an unrelated documentation page and not included in the documentation that actually covers these new field types.

But more importantly, it's frustrating that Pro is not consistent when registering these tables. That inconsistency could cause a critical failure for us (and already has), such as being unable to publish feature services to Enterprise versions prior to 11.2, unless we resort to an advanced "unsupported" fix or scrap the table/DB and register again, hoping it doesn't come out as high precision.

AndreaGalligari
Occasional Contributor

Only Esri is capable of delivering stochastic software.

GIS_Spellblade
Frequent Contributor

Do you have a current support ticket with Esri; if so, is there a registered defect or enhancement that our organization can attach itself to? We re-created a registered view, and it automatically cast the date field (which is not high precision in the feature class the view is based on) as high precision.


For anyone else who stumbles across this: the above workaround did not solve our view issue; we had to delete and re-register with an ArcGIS Pro version below 3.2.

RyanUthoff
Frequent Contributor

I have not submitted a support ticket yet because I have not been able to reliably reproduce this issue. More often than not, Pro will not make the field high precision. But every once in a while it does, and that makes it hard to present to Esri as a bug if I can't reliably reproduce it for them.

But thank you for posting your issue. I am able to reproduce it specifically with the views you mentioned. I was previously only testing with tables, which made it hard to reproduce. But I have a question: in the register tool, are you selecting the view directly from your DB connection? Or are you making a query layer first and then selecting the view from the table of contents in the register tool?

I CAN reproduce your issue when I am selecting the view directly from the DB. But I can NOT reproduce the issue if I make a query layer first. And that is what Annie said above in a reply to my post.

With what I posted above in this comment, it is technically not a bug because it appears to be working by design, but goodness, this is a terrible implementation choice. Having to rework all of our Python scripts to make query layers out of every view we need to register, just so Pro won't flag a field as high precision, is insane and unnecessary.

GIS_Spellblade
Frequent Contributor

Our workflow has been as follows:

  1. Create a View using CREATE OR ALTER VIEW syntax (within the GP tool and also directly against the geodatabase in SSMS); a sketch of this step is below
  2. Run Register with Geodatabase GP Tool, selecting the view from the appropriate SDE connection

The view we are using combines a registered point feature class with three other registered tables within the same geodatabase. We are not utilizing query layers.
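
For concreteness, step 1 looks something like this; every object name here is hypothetical, and the real view joins three tables rather than one:

CREATE OR ALTER VIEW dbo.points_with_status_vw AS
SELECT p.OBJECTID,
       p.SHAPE,
       p.last_edited_date,   -- plain datetime in the underlying feature class
       s.status_desc
FROM dbo.inspection_points p          -- registered point feature class
JOIN dbo.inspection_status s          -- one of the registered tables
  ON s.point_id = p.point_id;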

I'm currently running troubleshooting to prepare for a support ticket submission, to see if I can force the conversion by dictating the date format within SSMS using CAST and CONVERT.

RyanUthoff
Frequent Contributor

Thank you for your comment. To my understanding, for Pro not to make that field high precision, the input MUST be a query layer first; you then select that table/view from the table of contents within the register GP tool.

I am able to reproduce your issue using your workflow. However, it works fine if I create a query layer first and use the query layer in the GP tool and NOT the table/view directly from the SDE connection.

In this case, I think it's technically not a bug, but certainly not a good implementation of this new field type.

Unfortunately, this design choice has negative implications for us because it means all of our Python scripts that register tables/views have to be reworked to create query layers for everything first...

GIS_Spellblade
Frequent Contributor

I've found a SQL-based workaround. The date field by default (at least for editor tracking) is a datetime2(7), so you can run a CONVERT or CAST to downscale the precision appropriately.


CONVERT(datetime2(0), yourschema.last_edited_date) AS last_edited_date
CAST(yourschema.last_edited_date AS datetime2(0)) AS last_edited_date
-- by default the geodatabase generates datetime2(7), i.e. a precision of 7
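
To double-check what type the view actually exposes after the conversion, this catalog query helps (yourschema.your_view is a placeholder):

SELECT c.name, t.name AS type_name, c.scale
FROM sys.columns c
JOIN sys.types t ON t.user_type_id = c.user_type_id
WHERE c.object_id = OBJECT_ID('yourschema.your_view');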


EDIT

Spoke way too soon. After registering, the field still reads as high precision even though the type is set to datetime2(0).