Hello everyone,
We've been trying to export into an enterprise geodatabase using Local Server, and it throws the following error in the Local Server log:
2024-12-19 17:59:46,514 INFO rid=4 'worker-4776' Core - layers/0/applyEdits (code=17002): GraphicFeatureServer::HandleRESTRequest ## requestProperties = {"computeETag":true}
2024-12-19 17:59:46,514 INFO rid=4 'worker-4776' Core - layers/0/applyEdits (code=100001): REST request received. Request size is 16354 characters.
2024-12-19 17:59:46,517 INFO rid=4 'worker-4776' Core - SetupGeoTransformation::Init (code=17002): Input Spatial Reference:: Name: 'British_National_Grid' Factory Code: 27700; Output Spatial Reference:: Name: 'Unknown' Factory Code: 0.
2024-12-19 17:59:46,517 INFO rid=4 'worker-4776' Core - SetupGeoTransformation::Init (code=17002): There is no GeoTransformation in the Server instance for the Spatial References.
2024-12-19 17:59:46,517 INFO rid=4 'worker-4776' Core - SetupGeoTransformation::Init (code=17002): There is no Default GeoTransformation in the Spatial Reference Environment.
2024-12-19 17:59:46,519 ERROR rid=4 'worker-4776' Core - GraphicFeatureLayer::Add2 (code=17000): An error occurred.
2024-12-19 17:59:46,519 ERROR rid=4 'worker-4776' Core - GraphicFeatureServer::HandleREST_ApplyEditsOperation (code=17000): An error occurred.
2024-12-19 17:59:46,520 INFO rid=4 'worker-4776' Core - layers/0/applyEdits (code=5555) (elapsed=0.005074): outBytes=195; taskName=Edit
2024-12-19 17:59:46,520 INFO rid=4 'worker-4776' Core - layers/0/applyEdits (code=100004) (elapsed=0.005074): REST request successfully processed. Response size is 195 characters.
2024-12-19 17:59:46,544 INFO rid=4 'server' MapServer-test_simple - Request handled.
One peculiar thing about this is that the destination table has around 160 columns, and the add operation only inserts 5 entries containing 2 very short strings each, yet it still throws the error above. Within our code, an AggregateException is thrown with an inner Exception whose message is "Unable to complete operation" (error code 0x80004005).
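For context, our edit flow looks roughly like this (a sketch with placeholder names; table and features stand in for our actual ServiceFeatureTable and the 5 rows):

using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.Threading.Tasks;
using Esri.ArcGISRuntime.Data;

// Sketch of the add/applyEdits flow; 'table' and 'features' are
// placeholders for our actual ServiceFeatureTable and rows.
async Task AddRowsAsync(ServiceFeatureTable table, IEnumerable<Feature> features)
{
    try
    {
        await table.AddFeaturesAsync(features);
        await table.ApplyEditsAsync(); // this is where the failure surfaces
    }
    catch (Exception ex)
    {
        // We see an AggregateException whose inner Exception reports
        // "Unable to complete operation" (error code 0x80004005).
        if (ex is AggregateException agg)
            foreach (var inner in agg.Flatten().InnerExceptions)
                Debug.WriteLine(inner.Message);
        else
            Debug.WriteLine(ex.Message);
    }
}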
We tried reducing the column count to less than 10 and it seems to be exporting successfully.
My question is: is there a limit to how many columns we can export into an enterprise geodatabase table? And if there is, how do we export into tables with that many columns?
Hi
A few questions...
1/ Can you provide more information about what code/tools you're using to "export into enterprise geodatabase"? e.g. Is this a custom geoprocessing tool you've created with ArcGIS Pro (using Model Builder or Python) and is it using specific geoprocessing tools to create new tables in an enterprise geodatabase and import the data? A code sample or relevant snippets and a .gpkx file (if using) would be ideal.
2/ What versions of software are you using? (ArcGIS Maps SDK for .NET, ArcGIS Maps SDK for Local Server, ArcGIS Enterprise, database type/version)
3/ What does your local server deployment config file look like? (this is in the project folder of the project that references the NuGet package Esri.ArcGISRuntime.LocalServices). I recommend setting everything to false initially, and deleting any LocalServer200.x folders in your project output folder, so that you can test against the default SDK installation.
Thanks
Thank you for the response, Michael.
1/ Can you provide more information about what code/tools you're using to "export into enterprise geodatabase"? e.g. Is this a custom geoprocessing tool you've created with ArcGIS Pro (using Model Builder or Python) and is it using specific geoprocessing tools to create new tables in an enterprise geodatabase and import the data? A code sample or relevant snippets and a .gpkx file (if using) would be ideal.
We're using the LocalServerServices sample as our base code, so the setup is ArcGIS Maps SDK for .NET > ArcGIS Local Server > enterprise geodatabase (MSSQL). We're using an MPKX to communicate with the enterprise geodatabase. We created the tables using ArcGIS Pro. Layer 0 is a Point type with around 160 columns, and Layer 1 is a Polygon type with around 50 columns. Unfortunately, I couldn't share the table schema script as it's confidential.
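The service startup follows the sample, roughly like this (a sketch; the .mpkx path is a placeholder):

using Esri.ArcGISRuntime.LocalServices;

// Start Local Server, then host the map package (which references the
// enterprise geodatabase) as a local feature service. The path is a
// placeholder for our actual MPKX.
await LocalServer.Instance.StartAsync();
var featureService = new LocalFeatureService(@"C:\data\export_tables.mpkx");
await featureService.StartAsync();
// featureService.Url is then used to create the ServiceFeatureTables
// that the applyEdits calls go through.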
2/ What versions of software are you using? (ArcGIS Maps SDK for .NET, ArcGIS Maps SDK for Local Server, ArcGIS Enterprise, database type/version)
We're using 200.5 for ArcGIS Maps SDK for .NET, 200.1 for ArcGIS Maps SDK for Local Server, and MSSQL (20.1) for the database. We used ArcGIS Pro 3.4 to export the MPKX file.
3/ What does your local server deployment config file look like? (this is in the project folder of the project that references the NuGet package Esri.ArcGISRuntime.LocalServices). I recommend setting everything to false initially, and deleting any LocalServer200.x folders in your project output folder, so that you can test against the default SDK installation.
We used the default SDK config but left debug set to true.
Hi Michael! I was wondering if there are any updates on this query? Thank you.
Hi,
The first thing I recommend trying is changing the deployment config file to disable the deployment of the Local Server components, which will ensure the Local Server is running from the central SDK installation and all sub-components are available.
To do this, you can either change the deployment config as follows:
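For example, the Packages section would look something like this (a fragment, not the full file; set enabled="false" on every Package entry it lists):

<Packages>
  <Package id="ProSDE" name="SDE" enabled="false" />
  <Package id="ProSQLServer" name="SQL Server" enabled="false" />
  <!-- ...and likewise enabled="false" for every other Package entry... -->
</Packages>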
And then ensure you've deleted the LocalServer200.1 folder in your project output location.
Alternatively, you can set the LocalServer InstallPath property in code.
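For example (a sketch; the install path shown is an assumed default location, so adjust it to your machine):

using Esri.ArcGISRuntime.LocalServices;

// Point Local Server at the central SDK installation rather than a
// deployment folder. Set this before starting the server.
// (The path below is an assumed default install location.)
LocalServer.Instance.InstallPath = @"C:\Program Files\ArcGIS SDKs\LocalServer200.1";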
To double check where it's running from when you launch your app / start the local service, you can use Task Manager to check the location of the RuntimeLocalServer.exe process.
Assuming you're creating the MPKX and maintaining the references to the enterprise geodatabase, then ultimately you'll also need to enable the following options:
<Package id="ProSDE" name="SDE" enabled="true">
And
<Package id="ProSQLServer" name="SQL Server" enabled="true" />
Thanks
Hello,
It seems that the issue is that our schema has column names that are too long (the longest column name I saw is around 33 characters). I tried working around it by assigning aliases to the fields, but it still throws the same error; I looked at the JSON submitted to ArcGIS Local Server, and it's still sending the full column name.
Is there a good workaround to this?
It does look like the maximum field name length is 31 bytes - Enterprise geodatabase size and name limits—ArcGIS Pro | Documentation.
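If it helps, a quick check for offending fields before editing could look like this (a sketch; the helper name is ours, and the 31-character limit is from the doc above):

using System.Diagnostics;
using Esri.ArcGISRuntime.Data;

// Flag any fields whose names exceed the SQL Server enterprise
// geodatabase column-name limit (31, per the doc linked above).
static void WarnOnLongFieldNames(FeatureTable table)
{
    const int MaxFieldNameLength = 31;
    foreach (Field field in table.Fields)
    {
        if (field.Name.Length > MaxFieldNameLength)
            Debug.WriteLine(
                $"Field '{field.Name}' ({field.Name.Length} chars) exceeds {MaxFieldNameLength}.");
    }
}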