SDE on Oracle

08-17-2012 05:46 AM
ScottBlankenbeckler
Occasional Contributor
I have several processes that update tables in SDE. If I run two processes, the times seem normal. However, when I add a third process, the times for all processes double.  When viewing the database I can see that the processes appear to be stepping on each other while writing to the Last Modified table.  Since I was not the person who set up SDE on this database, I am not sure what the configuration looks like.

So my question is ... Is this normal? And if not where should I look to correct it?
5 Replies
VinceAngelo
Esri Esteemed Contributor
What version of ArcSDE are you using?  What service pack is installed?  Any patches?

What application(s) are you using to run updates?  What version & SP on the app(s)?

What kinds of updates are being performed?  Which table is experiencing contention?

In general I try to avoid parallel inserts or updates to a single table, precisely because
of the kind of behavior you describe.

- V
ScottBlankenbeckler
Occasional Contributor
ArcSDE 10.1. I don't believe there are any service packs.
Oracle 11g. Patches? Not sure, but we should be current on all patches.

I am running several python 2.7 scripts, each updates its own set of tables, no two are identical. The table that seems to be in conflict is one of the SDE housekeeping tables, GDB_TABLES_LAST_MODIFIED.

Example:
Script 1
Tables - Orders_v1, Routes_v1

Script 2
Tables - Orders_v2, Routes_v2

Script 3
Tables - Orders_v3, Routes_v3
VinceAngelo
Esri Esteemed Contributor
So the application is arcPy (ArcGIS 10.1).  A description of what the scripts are
doing might be useful (especially if you want to pursue this with Tech Support).

GDB_* tables are not *ArcSDE* housekeeping tables, but geodatabase repository
tables.  It's hard to tell without looking at both source and a trace if it's your app
or arcPy itself which is generating multiple locked update requests.  If you don't
run your updates in a transaction, you should certainly do that, and hope that
arcPy won't update the modified time until COMMIT TRANSACTION.
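[Editor's note: a minimal sketch of the batching principle Vince describes, using sqlite3 as a stand-in for the arcpy/Oracle stack (which cannot be shown runnably here). The table name and row shape are made up for illustration; the point is that one explicit transaction around a batch of inserts means the commit, and any housekeeping triggered by it, fires once per batch rather than once per row.]

```python
import sqlite3

# sqlite3 standing in for the real Oracle/SDE target; "orders" and the
# row shape are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, service_time REAL)")

rows = [(i, 15.0) for i in range(1000)]

# One BEGIN ... COMMIT around the whole batch: the connection context
# manager commits on clean exit and rolls back on exception.
with conn:
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 1000
```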

- V
ScottBlankenbeckler
Occasional Contributor

So the application is arcPy (ArcGIS 10.1). A description of what the scripts are
doing might be useful (especially if you want to pursue this with Tech Support).


Yes, it's an arcpy script. Each script is identical except for a slight modification in the data.  I am running a vehicle routing problem with the service times being different between each script. After the solve is complete, I append the orders layer to a table in SDE called Orders_v1 (etc.). The script then moves to the next area in the list, processes it, appends, and moves on.

Script 1 has service time values of 15% A and 85% B
Script 2 has service time values of 50% A and 50% B
Script 3 has service time values of 100% A

We then compare the output to determine the viability of each plan.  Each run consists of about 1000 areas with order counts ranging from 10 to 1500.

The commands I am using are:
Orders = "VRP\\Orders"
Routes = "VRP\\Routes"
SDE_Routes = "Database Connections\\DATABASE\\account.Routes_v1"  # Actual database name and account removed
SDE_Orders = "Database Connections\\DATABASE\\account.Orders_v1"  # Actual database name and account removed
arcpy.Append_management(Routes, SDE_Routes, "NO_TEST")
arcpy.Append_management(Orders, SDE_Orders, "NO_TEST")


The environment is set to auto-commit after 1000 records, but since most areas have fewer than 1000 records, I am assuming it commits after it finishes each area.

When all three versions are run, competing locks are created on GDB_TABLES_LAST_MODIFIED.

VinceAngelo
Esri Esteemed Contributor
At this point you probably want to look at saving the intermediate products to a
file geodatabase (temporary, generated for each model run -- make sure it's on a
local filesystem, not a networked one), then, when the model run is complete, append
the results to the master table and delete the temp FGDB.
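[Editor's note: a sketch of that staging pattern, with sqlite3 files standing in for the temporary file geodatabase and the master SDE table (the real workflow would use arcpy.CreateFileGDB_management and Append_management). Table names and row counts are invented for illustration; the shape of the pattern is what matters: all per-area writes go to a local scratch store, and the shared table sees exactly one append per run.]

```python
import os
import sqlite3
import tempfile

# "master" stands in for the shared SDE table that the three scripts
# were contending over.
master = sqlite3.connect(":memory:")
master.execute("CREATE TABLE orders (area TEXT, n INTEGER)")

# Per-run scratch store, standing in for the temp FGDB on local disk.
scratch_path = os.path.join(tempfile.mkdtemp(), "run1.db")
scratch = sqlite3.connect(scratch_path)
scratch.execute("CREATE TABLE orders (area TEXT, n INTEGER)")

# The per-area loop writes only to the local scratch store.
for area, n in [("A", 10), ("B", 1500), ("C", 300)]:
    with scratch:
        scratch.execute("INSERT INTO orders VALUES (?, ?)", (area, n))

# One append to the shared table once the whole run is complete.
rows = scratch.execute("SELECT area, n FROM orders").fetchall()
with master:
    master.executemany("INSERT INTO orders VALUES (?, ?)", rows)

scratch.close()
os.remove(scratch_path)  # delete the scratch store, as with the temp FGDB

total = master.execute("SELECT SUM(n) FROM orders").fetchone()[0]
print(total)  # 1810
```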

- V