
Publishing large scene layers from S3

12-02-2020 02:46 AM
AngusHooper1
Frequent Contributor

Referencing this blog https://www.esri.com/arcgis-blog/products/arcgis-enterprise/3d-gis/publishing-large-scene-layers-in-....

Is there an example Python workflow or GUI-based workflow for publishing the large scene layers with data in S3 in the i3sREST format? I have successfully published using this Python script and workflow for data that was converted and exported to a NAS. I have not had the same luck with S3 - I receive an error 99999 when attempting to publish the service in the last step of the script. I encounter no issues registering the S3 bucket as a cloud store in ArcGIS Server.

I also noticed that the code specifically grabs the server ID from the hosting server, whereas the comments note that the server can just be a federated server. Is that true?

7 Replies
GarimaTiwari
Esri Contributor

Is there an example Python workflow or GUI-based workflow for publishing the large scene layers with data in S3 in the i3sREST format?


We do not have a GUI or Python-based workflow yet. We are working on it for a future release.


I receive an error 99999 when attempting to publish the service in the last step of the script. I encounter no issues registering the S3 bucket as a cloud store in ArcGIS Server.

Have you registered your S3 bucket as a cloud store using the ArcGIS Enterprise portal or ArcGIS Server Manager? This workflow would only succeed if you have registered the cloud store using the ArcGIS Enterprise portal. Would you please confirm this? Do you get any messages in the server logs along with error 99999? If so, please share them with us.
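
As a quick check - this is only a minimal sketch assuming the ArcGIS API for Python, with a placeholder portal URL, credentials, and owner filter - a cloud store registered through the portal shows up as a Data Store item that you can search for, while one added only in ArcGIS Server Manager does not:

# Minimal sketch: list Data Store items owned by the publishing user.
# The portal URL, credentials, and owner filter below are placeholders.
from arcgis.gis import GIS

gis = GIS("https://portal.example.com/portal", "admin_user", "admin_password")

# An S3 cloud store registered only in ArcGIS Server Manager will not appear here.
for item in gis.content.search(query="owner:admin_user", item_type="Data Store"):
    print(item.title, item.id)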


I also noticed that the code specifically grabs the server ID from the hosting server, whereas the comments note that the server can just be a federated server. Is that true?


Yes, you can modify the ArcGIS API for Python script to select a federated server. This is supported.
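
For reference, here is a minimal sketch of picking a federated server rather than the hosting server; the portal URL, credentials, and target server URL are placeholders, and the exact fields returned by the federation endpoint may vary by version:

from arcgis.gis import GIS

gis = GIS("https://portal.example.com/portal", "admin_user", "admin_password")

# List every server federated with the portal; each entry carries its id,
# url, and role (the hosting server vs. a regular federated server).
fed_servers = gis.admin.federation.servers["servers"]

# Instead of filtering on the hosting role, match the federated server you
# want to publish to by its URL (placeholder below).
target = next(s for s in fed_servers if s["url"] == "https://scene.example.com/server")
server_id = target["id"]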

 

Best Wishes, 

Garima

AngusHooper1
Frequent Contributor

This workflow would only succeed if you have registered the cloud store using the ArcGIS Enterprise portal.

 

This 100% could be the cause. I'll run through the workflow again today and get back with the results.

AngusHooper1
Frequent Contributor

Thanks @GarimaTiwari 

 

Manually registering the data store item with Portal and Server removes the initial steps. When publishing, we still hit error 99999. The server logs show the following:

Delegate job failed.
ERROR: info file in extracted cache not found. Failed to execute (Publish Datasets In Data Stores).
ERROR: info file in extracted cache not found.
Unable to access scene layer file.
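
(For completeness, a hedged sketch of pulling those SEVERE entries with the ArcGIS API for Python rather than Server Manager; the portal URL, credentials, server index, and response keys are assumptions here.)

from arcgis.gis import GIS

gis = GIS("https://portal.example.com/portal", "admin_user", "admin_password")

# Pick the federated server the scene service is being published to
# (index 0 is only a placeholder choice).
server = gis.admin.servers.list()[0]

# Query recent SEVERE log messages; the response mirrors the admin REST
# logs/query endpoint, so entries sit under "logMessages".
resp = server.logs.query(level="SEVERE")
for entry in resp.get("logMessages", []):
    print(entry.get("time"), entry.get("message"))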

I am not confident in the service_conf.

"type": "SceneServer",
"serviceName": "service name",
"properties": {
"pathInCachedStore" : " / root S3 directory / ",
"cacheStoreId": cache_store_id,

Thoughts?

GarimaTiwari
Esri Contributor

The "pathInCachedStore" should be the folder where the i3sREST content is stored. This folder contains the scene layer configuration necessary for publishing.

For example: 

"pathInCachedStore" : " /rootS3directory/My_sceneContent.i3srest"

AngusHooper1
Frequent Contributor

Still hitting the same error. Using your example, is this what you would expect to see in /My_sceneContent.i3srest?

(attached screenshot of the .i3srest folder contents: AngusHooper1_0-1607673881158.png)

 

TelmaFernandes2
Emerging Contributor

Hi @GarimaTiwari  

I am trying to follow this practice. I have a 149 GB .slpk file located in a folder on my server. Following the Python workflow, I managed to register the folder where the file is located as a data store in Portal and in ArcGIS Server as well. However, when I get to the publishing step, it doesn't recognize the cacheStoreId I'm putting in. Below is an example of what I am putting in the service configuration, and I am not sure if it is what is expected:

## Step 4a. Create a dictionary for your service configuration
service_conf = {
    "type": "SceneServer",
    "serviceName": "Mesh",
    "properties": {
        "pathInCachedStore": "/fileShares/Geodata/filetopublish/Production_6_ESRI_total.slpk",
        "cacheStoreId": "67df774917f34228b38e658e5f229576",
    }
}
service_conf

In the "pathInCachedStore" parameter I'm putting the path to the folder where my slpk file that I want to publish is located.
In the "cacheStoreId" parameter I am putting id of my datastore that I consulted through the ArcGIS Serve Admin directory.
Can you tell me what I am configuring wrong?

AngusHooper1
Frequent Contributor

Problem resolved. It was a data issue with the params we used in the data conversion / exporter utility.
