sde_logfile_data duplicate key error

03-18-2014 05:53 AM
New Contributor II
Greetings,

I am running ArcGIS Server 10 on Linux with PostgreSQL 9.0.  Every time I try to update a feature service, ArcGIS Server reports only "A database error occurred."  I looked in the PostgreSQL log files and found the following entry:

ERROR: duplicate key value violates unique constraint "sde_logfile_data_pk"
DETAIL: Key (logfile_data_id, sde_row_id)=(23432, 686) already exists.
STATEMENT: INSERT INTO sde.sde_logfile_data (logfile_data_id,sde_row_id) VALUES (23432, $1)

These 3 lines show up numerous times, and I am not sure how to fix this.  The feature service ran fine for a while; there are over 1,000 features currently in the database, so I know things were OK at some point.

Thanks,
Mark
11 Replies
MVP Regular Contributor
Quoting Mark's original post:
> I am running ArcGIS Server 10 on Linux using PostgreSQL 9.0 ... ERROR: duplicate key value violates unique constraint "sde_logfile_data_pk" ... These 3 lines show up numerous times and I am not sure how to fix this.


It looks like something got hosed up in your SDE_LOGFILE_DATA table, which is one of the tables involved when you make a selection of 1,000 or more features over a geodatabase connection.  While not a particularly common error, I can't say I'm surprised, because I've seen issues before with ArcSDE's cleanup of rows in that table.  In my mind, there is a short-term fix and a long-term fix for this issue.  I know you're using Postgres, but I'm not as familiar with that RDBMS, so I'm going to provide an Oracle-based solution.

The short-term fix is to truncate both the SDE_LOGFILE_DATA and the SDE_LOGFILES tables and start over with your session:
TRUNCATE TABLE SDE.SDE_LOGFILES;
TRUNCATE TABLE SDE.SDE_LOGFILE_DATA;
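
Since the original poster is on PostgreSQL, the same short-term reset would look roughly like the following, assuming the log file tables live in the sde schema as the error message suggests.  This is a sketch, not a tested recipe; run it while no one is connected as the table owner:

```sql
-- PostgreSQL sketch: truncating both tables in one statement lets
-- TRUNCATE handle any foreign-key relationship between them.
TRUNCATE TABLE sde.sde_logfiles, sde.sde_logfile_data CASCADE;
```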


The long-term fix is to drop those tables altogether and re-create them as Global Temporary tables (or the equivalent in Postgres):
DROP TABLE SDE.SDE_LOGFILE_DATA CASCADE CONSTRAINTS;
DROP TABLE SDE.SDE_LOGFILES CASCADE CONSTRAINTS;

CREATE GLOBAL TEMPORARY TABLE SDE.SDE_LOGFILE_DATA
   (
    logfile_data_id NUMBER(*,0) NOT NULL ENABLE,
    sde_row_id NUMBER(*,0) NOT NULL ENABLE
   )
ON COMMIT PRESERVE ROWS
/
CREATE INDEX SDE.SDE_LOGFILE_DATA_IDX1
    ON SDE.SDE_LOGFILE_DATA(LOGFILE_DATA_ID,SDE_ROW_ID)
/
CREATE INDEX SDE.SDE_LOGFILE_DATA_IDX2
    ON SDE.SDE_LOGFILE_DATA(SDE_ROW_ID)
/
GRANT DELETE ON SDE.SDE_LOGFILE_DATA TO SDE
/
CREATE GLOBAL TEMPORARY TABLE SDE.SDE_LOGFILES
  (
    logfile_name VARCHAR2(256) NOT NULL ENABLE,
    logfile_id NUMBER(*,0) NOT NULL ENABLE,
    logfile_data_id NUMBER(*,0) NOT NULL ENABLE,
    registration_id NUMBER(*,0) NOT NULL ENABLE,
    flags NUMBER(*,0) NOT NULL ENABLE,
    session_tag NUMBER(*,0) NOT NULL ENABLE,
    logfile_data_db VARCHAR2(32),
    logfile_data_owner VARCHAR2(32),
    logfile_data_table VARCHAR2(98),
    column_name NVARCHAR2(32)
   )
ON COMMIT PRESERVE ROWS
/
CREATE UNIQUE INDEX SDE.SDE_LOGFILES_PK
    ON SDE.SDE_LOGFILES(LOGFILE_ID)
/
CREATE UNIQUE INDEX SDE.SDE_LOGFILES_UK
    ON SDE.SDE_LOGFILES(LOGFILE_NAME)
/
CREATE UNIQUE INDEX SDE.SDE_LOGFILES_UK2
    ON SDE.SDE_LOGFILES(LOGFILE_DATA_ID)
/
GRANT DELETE ON SDE.SDE_LOGFILES TO SDE
/


Oracle global temporary tables, by definition, clear themselves when the session that populated them disconnects.  I believe there may be an equivalent type of table in Postgres.
New Contributor II
Thanks for getting back to me.  I am using a PostgreSQL 9.0 database; it was an EnterpriseDB install on a Linux machine.  Will these same steps work, with just the SQL syntax changed?
Esri Esteemed Contributor
This is probably an issue for Tech Support, but you can clear the issue by truncating
the logfile table pair (which will require that no one be logged in as the owner).

Note that "Best Practice" dictates that you should NEVER use the SDE user to own
spatial data, and you should only connect as SDE if you are doing geodatabase
maintenance.  Instead you should always create additional users to own and manage
spatial data, and also create non-owner users for managing access to the spatial
data.  Permissions should be granted through roles.

- V
MVP Regular Contributor
> I am using a PostgreSQL 9.0 database ... Will these same steps work, just changing the SQL syntax possibly?


I had modified my comments above right before you posted this.  After some further review, though, I believe I was slightly off course with regard to global temp tables in Postgres.  In Postgres, a temporary table is accessible only to the session that created it and is dropped when that session closes.  You can include GLOBAL in the CREATE TABLE statement as I described above, but the database ignores it.  Postgres requires each session to issue its own CREATE TEMPORARY TABLE command for each temp table it uses, creating the table "on the fly" rather than ahead of time.  I apologize for suggesting the long-term solution above, since it likely won't work for Postgres.  That said, you can go with the short-term solution of truncating the tables (which Vince also seems to suggest) and see if that fixes your problem.
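
To illustrate that Postgres behavior: a temporary table exists only for the session that created it, so it cannot be pre-created once by the sde user the way an Oracle global temporary table can.  A minimal sketch (the table and column names here are purely illustrative):

```sql
-- Session A: the temp table lives only in this session.
CREATE TEMPORARY TABLE scratch_ids (row_id integer NOT NULL);
INSERT INTO scratch_ids VALUES (1);

-- Session B cannot see scratch_ids at all, and the table (and its
-- rows) disappears automatically when Session A disconnects.
```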
New Contributor II
I truncated (with cascade) both the sde_logfile_data and sde_logfiles tables.  I then restarted PostgreSQL, GeoEvent Processor, and ArcGIS Server, and I still get the same errors.  The IDs shown in the logs are different, but otherwise it's the same error.
MVP Regular Contributor
I think what is happening is that there are multiple connections to your service using the same database user account, and each of those sessions (connections) is trying to select some of the same records and insert them into the same log file tables.  What you need to do is change your ArcSDE configuration to use session-based log files.  My guess is that you are using the shared log file configuration, which is the default for Postgres databases.  Session-based log files are dedicated to a single session (rather than a specific user account) and may contain multiple selection sets (log files); each session that logs in gets its own set of tables for selections.  You should use session-based log files when numerous concurrent connections are made to the geodatabase with the same login / user account.  To be more specific, you'll likely want non-pooled session-based log files.

The following links describe the various ways of configuring how logfiles are handled in ArcSDE (e.g., Shared vs. Session-Based vs. Stand-Alone):

help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#/Log_file_table_configuration_options_for_geod...

http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#/Log_file_tables/002p00000081000000/
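
For what it's worth, the switch described in those links is driven by parameters in the geodatabase's server configuration table.  A hedged sketch of what the change might look like in SQL, assuming the 10.x Postgres table is sde.sde_server_config with prop_name / num_prop_value columns (verify against the linked documentation before running anything):

```sql
-- Assumption: ALLOWSESSIONLOGFILE = 1 lets each session create its
-- own session-based log file tables instead of the shared pair.
UPDATE sde.sde_server_config
   SET num_prop_value = 1
 WHERE prop_name = 'ALLOWSESSIONLOGFILE';
```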
Esri Esteemed Contributor
The session ids should be different between logins, so this error should not be possible.
There is some characteristic of your install which is causing this error to recur.  I suggest
you contact Tech Support, so they can try to discover the root problem.

- V
MVP Regular Contributor
> The session ids should be different between logins, so this error should not be possible.


Vince, what I'm saying is that shared log files are shared by all sessions that connect as the same user, which may very well be the case here.  If multiple users connect with the same account, all of those sessions insert records into the same log file data table; the session IDs being different wouldn't matter, because shared log files are used irrespective of session ID.  Session-based log files, by contrast, are dedicated to a single session, and that is where the session ID comes into play.  I think he is currently set up with shared log files and the connections to his feature service all use the SDE account.  I might be misunderstanding you altogether, but I think if he switches to session-based log files this error will go away.
New Contributor II
I'm new to ArcGIS Server and geodatabases, so I'm sorry this has been a mess.  All I am trying to do is receive a data feed via GeoEvent Processor and update a feature class in ArcGIS Server.  Here's my data flow:

JSON Data -> GeoEvent Processor -> Update Feature Service -> Query information in ArcGIS Server via REST

So I'm not sure where the multiple database connections are coming from, whether GEP is pooling connections or something else.  Since GEP just does a REST push to ArcGIS Server, there are no connections to the database other than ArcGIS Server's.  I followed the instructions at http://help.arcgis.com/en/arcgisdesktop/10.0/help/index.html#/Log_file_table_configuration_options_f... and changed the value of ALLOWSESSIONLOGFILE to 1.  The other two values the instructions suggested changing were already set as specified.

After updating ALLOWSESSIONLOGFILE, I restarted ArcGIS Server, PostgreSQL, and GeoEvent Processor.  I am still getting an error message:

ERROR: duplicate key value violates unique constraint "sde_session_pk305"
DETAIL: Key (logfile_data_id, sde_row_id)=(24140,899) already exists.
STATEMENT: INSERT INTO sde.sde_session305 (logfile_data_id,sde_row_id) VALUES (24140, $1)

Thanks for the continued support,
Mark