
How to check on cache job status

2 weeks ago
SebastianBosbach
Occasional Contributor

Hi,

I'm trying to automate uploading my scene services.
I tried the "new" way to publish them with Python, as described in this documentation:
https://pro.arcgis.com/en/pro-app/latest/arcpy/sharing/publish.htm
The upload works fine, but I have to wait until the cache generation is finished before I can upload additional data; otherwise my server gets overloaded.
Some of the services take 1 hour to complete cache generation, others take up to 14 hours.
So I want to check the job status. The cache_job_id is even returned in Esri's example, but I can't find a way to check on this job.
Can anyone link me to the right documentation or an example?

 

5 Replies
ChrisUnderwood
Esri Regular Contributor

Hello @SebastianBosbach, in ArcGIS Server Manager you can click on this View Cache Status button. Does that tell you what you need to know?

(screenshot: the View Cache Status button in ArcGIS Server Manager)

 

SebastianBosbach
Occasional Contributor

No. First, the cache job status of a scene layer isn't shown there. Second, I need to automate polling the job with Python, so even if the status were shown there, going to Server Manager manually wouldn't be a solution for me.

HaydenWelch
MVP Regular Contributor

Most arcpy passthrough functions return a Result object. Those objects have a status value.

 

Edit: I see that the sharing function returns some REST endpoints. Can you ping those to test for completion? Just have a while loop that checks for a 200 status at the endpoint?
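A minimal sketch of that idea: poll the REST URL returned by arcpy.sharing.Publish until it answers with HTTP 200. The `rest_url` value is an assumption here (take it from the publish result), and the timeout/interval values are arbitrary placeholders.

```python
import time

import requests


def wait_for_endpoint(rest_url, timeout_s=4 * 3600, interval_s=60):
    """Poll `rest_url` until it returns HTTP 200 or `timeout_s` elapses.

    Returns True if the endpoint responded with 200, False on timeout.
    `rest_url` is assumed to come from the publish result, e.g.
    res["web_scene_layer"]["rest_url"].
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        try:
            r = requests.get(rest_url, params={"f": "json"}, timeout=30)
            if r.status_code == 200:
                return True
        except requests.RequestException:
            pass  # the server may be busy while caching; keep polling
        time.sleep(interval_s)
    return False
```

Note that a scene service REST endpoint may answer 200 before caching is done, so this only tells you the service exists, not that the cache job finished.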

SebastianBosbach
Occasional Contributor

Hi, that's exactly what I want to do.
From the documentation: "Returns a dictionary that includes the item URL, REST URL, and cache job ID."
But there is no additional info on how to use the job ID to get the job status; that's the thing I'm looking for help on.
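One place that might be worth trying is the ArcGIS Server Administrator API, which exposes a jobs resource under the admin root. Whether the cache_job_id returned by arcpy.sharing.Publish shows up there is an assumption, not something the documentation confirms, so verify against your own server. The `server_admin_url` name and the shape of the JSON response are assumptions as well.

```python
import requests


def job_status_url(server_admin_url, job_id):
    """Build the admin jobs URL, e.g.
    https://host:6443/arcgis/admin/system/jobs/<job_id>
    (path layout is an assumption -- check your server's admin API)."""
    return f"{server_admin_url}/system/jobs/{job_id}"


def get_job_info(server_admin_url, job_id, token):
    """Query the admin jobs resource and return its raw JSON.

    Inspect the returned dictionary to find the status field your
    server version actually reports.
    """
    r = requests.get(
        job_status_url(server_admin_url, job_id),
        params={"f": "json", "token": token},
        verify=False,
        timeout=30,
    )
    r.raise_for_status()
    return r.json()
```

If the job does not appear there, the caching job may instead live under the System/CachingControllers service on the server; the admin API is the place to dig.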

SebastianBosbach
Occasional Contributor

I have now implemented a "dirty" way to check on the cache job, because I still don't know where to query the job using the job_id returned by the publish function.

In case someone else finds this thread, this is how I check whether the cache job is still running:

- publish the scene layer
- get the scene layer URI
- check in a loop whether the property "lastUpdate" has a value different from "-999999"
- if so, cache generation is finished
- publish the next scene layer

This script only checks one layer of the published data ("/layers/0"); if your data has more than one scene layer, you need to adapt this.

I would still appreciate it if someone could point me to the "right" way of checking the cache status using the job_id.

import time
from datetime import datetime, timedelta

import arcpy
import requests


def get_token(server_url, username, password):
    url = f"{server_url}/admin/generateToken"
    data = {
        "username": username,
        "password": password,
        "client": "requestip",
        "f": "json"
    }
    r = requests.post(url, data=data, verify=False)
    r.raise_for_status()
    return r.json()["token"]


def get_layer_last_update(layer_url, token):
    params = {"f": "json", "token": token}
    r = requests.get(layer_url, params=params, verify=False)
    r.raise_for_status()
    layer_info = r.json()
    return layer_info.get("serviceUpdateTimeStamp", {}).get("lastUpdate", -999999)

# ...
# generate the aprx and everything else needed to publish the data
# ...

    res = arcpy.sharing.Publish(scene_draft)

    print("item_url: " + res["web_scene_layer"]["item_url"])
    print("rest_url: " + res["web_scene_layer"]["rest_url"])
    print("cache_job_id: " + res["web_scene_layer"]["cache_job_id"])

    timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    print(f"[{timestamp}] Finished publishing {bu}")

    layer_url = res["web_scene_layer"]["rest_url"] + "/layers/0"
    token = get_token(server_url, username, password)
    token_expires = datetime.now() + timedelta(minutes=30)

    while True:
        try:
            # refresh the token after 30 minutes
            if datetime.now() >= token_expires:
                token = get_token(server_url, username, password)
                token_expires = datetime.now() + timedelta(minutes=30)

            last_update = get_layer_last_update(layer_url, token)
            timestamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
            status_text = f"[{timestamp}] 🔍 cache generation still running (lastUpdate = {last_update})"

            print(status_text.ljust(100), end="\r")  # overwrite the line in place

            if last_update != -999999:
                print(f"\n✅ cache generation finished (lastUpdate = {last_update})")
                break

        except Exception as e:
            print(f"❗ error fetching lastUpdate: {e}")
            break

        time.sleep(60)
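As noted above, the script only polls "/layers/0". A sketch for waiting on every layer, assuming the service JSON exposes the layer count and reusing a fetch function with the same signature as get_layer_last_update above (passed in here so the sketch stays self-contained):

```python
def all_layers_updated(rest_url, layer_count, token, fetch_last_update):
    """Return True once every layer reports a real lastUpdate value.

    fetch_last_update(layer_url, token) -> int, e.g. the
    get_layer_last_update function from the script above.
    The -999999 sentinel follows the same convention as that script.
    """
    for i in range(layer_count):
        layer_url = f"{rest_url}/layers/{i}"
        if fetch_last_update(layer_url, token) == -999999:
            return False
    return True
```

You would call this in the polling loop instead of the single get_layer_last_update check, breaking out once it returns True.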

 
