
Timeout when Loading

01-11-2011 01:59 PM
TedCronin
MVP Honored Contributor
Is there a timeout when trying to load large datasets to ArcGIS.com? I can consistently script small datasets to load, but if I try to script a large dataset, it fails.
11 Replies
MikeMinami
Esri Notable Contributor
How big is large?

Mike
TedCronin
MVP Honored Contributor
Dude, thanks for getting back to me, Mike.

400 MB. I can easily load two 11 MB files. I am using the arcpy.SharePackage method, so this may be an issue with GP or with Mapping; I may post to all three forums since I am not sure where this will fall. I am also seeing some other strange limitations. I have scripts running right now that should give me better insight into the issues, or at least a starting point for you guys to consider, whichever team I end up bringing these issues to on the forums. Some really peculiar behavior that may or may not be expected.

Loading this large file manually used to work at 10 final (no SP); since the service pack, and whatever maintenance has happened on the AGO side, I am not sure, so maybe there is an issue there as well. I will look into these questions a bit further tomorrow, most definitely.
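For reference, the call I am scripting looks roughly like this (the package path, account, and tags below are placeholders rather than my real values):

import arcpy

# Placeholder path and credentials -- swap in your own values.
lpk = r"C:\Packages\ParcelFabric.lpk"

# Share Package (Data Management) uploads the .lpk to ArcGIS.com; see the
# tool help for the optional public/groups parameters.
arcpy.SharePackage_management(lpk,
                              "my_agol_user",                 # user name
                              "my_agol_password",             # password
                              "Parcel fabric layer package",  # summary
                              "parcels; fabric",              # tags
                              "")                             # credits
print arcpy.GetMessages()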
MikeMinami
Esri Notable Contributor
Yes, 400 MB is probably timing out on you, but it's a web-tier timeout. The problem is that the post-upload processing we do on the file takes longer than the timeout, but in our experience it still gets processed. If you come back in, say, 20 minutes, do you see the item in "My Contents"? If not, there could be some other error that you don't see that is preventing it from getting processed correctly.
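
If you want to script that check rather than do it by hand, something along these lines could work, assuming the item was shared publicly so the anonymous ArcGIS.com search endpoint can see it (the title, URL, and wait time here are just an example):

import json
import time
import urllib

title = "ParcelFabric"  # example item title

# Give the post-upload processing time to finish, then ask ArcGIS.com
# whether an item with that title exists. Anonymous search only sees
# items shared publicly; adjust the URL if your portal path differs.
time.sleep(20 * 60)
params = urllib.urlencode({"q": title, "f": "json"})
resp = urllib.urlopen("http://www.arcgis.com/sharing/rest/search?" + params)
results = json.load(resp).get("results", [])
if results:
    print "Found %d matching item(s) on ArcGIS.com" % len(results)
else:
    print "No item named %s found yet -- the upload may have failed" % title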

Thanks,

Mike
TedCronin
MVP Honored Contributor
I do see a file in My Contents, so what you are saying is that it is a valid lpk. My issue then is that if I have a few lpks and one happens to be large, I need to make sure the largest is processed last, so that when I get a general 999999 error my script will stop but the data will still be there; that is hardly a solution for automation. What happens when your next releases come out that support even larger lpks? Surely this is an issue that can be fixed on your end to support larger loads.

Currently, I have divided my large lpk into six additional lpks to see if they will load successfully, so I will be starting that shortly. I do, however, still have a 200 MB lpk and a 173 MB one, so there may still be issues. There has to be a workable workaround that does not require me to divide my parcel and parcel line layers into chunks; clearly that is crazy.
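
The ordering part at least is easy to script; a quick sketch of sorting the packages so the biggest one uploads last (the folder path is a placeholder):

import glob
import os

pkg_folder = r"C:\Packages"  # placeholder folder holding the .lpk files

# Sort the layer packages smallest-first so the one most likely to hit
# the timeout is uploaded last.
lpks = glob.glob(os.path.join(pkg_folder, "*.lpk"))
lpks.sort(key=os.path.getsize)

for lpk in lpks:
    size_mb = os.path.getsize(lpk) / (1024.0 * 1024.0)
    print "%s  (%.1f MB)" % (os.path.basename(lpk), size_mb)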
TedCronin
MVP Honored Contributor
My new 169 MB file just failed, so the size threshold that works for loading data into ArcGIS.com is somewhere between 20 MB and 169 MB.
MikeMinami
Esri Notable Contributor
Can you just ignore the error in your script? In our experience, a time out can occur, but the file does upload successfully. Let me know if you find otherwise as you have quite large files. We are currently investigating various ways to fix the time out problem, but it's not resolved yet.

Thanks,

Mike
TedCronin
MVP Honored Contributor
Can you just ignore the error in your script? In our experience, a time out can occur, but the file does upload successfully. Let me know if you find otherwise as you have quite large files. We are currently investigating various ways to fix the time out problem, but it's not resolved yet.

Thanks,

Mike


I could, if this script is the last one to run and no processes need to be completed afterward through a subprocess; then, after the script runs, I can manually close out of Python the next day. That may need to be the workaround. I will play with this a bit more, maybe just put in a try/except and see if that passes me through.
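
Maybe something along these lines, with each call wrapped in a try/except so the timeout error gets printed but the loop keeps going (paths and credentials are placeholders again):

import os
import datetime
import arcpy

# Placeholder list -- in practice this is the size-sorted list from before.
lpks = [r"C:\Packages\Condos.lpk", r"C:\Packages\ParcelFabric.lpk"]

for lpk in lpks:
    print "Sharing " + os.path.basename(lpk)
    print datetime.datetime.now()
    try:
        arcpy.SharePackage_management(lpk, "my_agol_user", "my_agol_password",
                                      "Uploaded by script", "parcels", "")
    except arcpy.ExecuteError:
        # The web-tier timeout surfaces as ERROR 999999; print it, move on,
        # and confirm the item in My Contents afterwards.
        print arcpy.GetMessages(2)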

Thank you for staying on this Mike.
TedCronin
MVP Honored Contributor
Sharing BookandPage.lpk
2011-01-13 17:57:50.598000
Sharing Condos.lpk
2011-01-13 18:01:15.692000
Sharing MineralRights.lpk
2011-01-13 18:02:56.535000
Sharing ParcelFabric.lpk
2011-01-13 18:04:00.395000
ERROR 999999: Error executing function.
Failed to execute (SharePackage).

Sharing ParcelLines.lpk
2011-01-13 18:32:11.520000
ERROR 999999: Error executing function.
Failed to execute (SharePackage).


Sharing ParcelReference.lpk
2011-01-13 19:05:33.051000
Sharing StateTRA.lpk
2011-01-13 19:09:53.082000


So, it's not pretty, but so far it's looking good to go as a workaround. I just downloaded Parcel Lines, and they look OK as far as I can tell (2.3 million features); I am currently downloading Parcels to have a look as well. As you can see above, I am passing the errors and moving on, which is pretty cool. The only problem was having to write three extra lines of code, which is just extra, but someday perhaps I can remove them; at least I can finish this script up. I should have thought about the try...except on my end before coming to the forums, so sorry about wasting anyone's time on this issue. At least we know there may be other issues with other large loads.

Thanks, Mike, for your help and guidance.


So, looking at the Parcels (775,000 features), they look like they are ok as well, so cool.
MikeMinami
Esri Notable Contributor
No worries, Ted. Glad it seems to be working. And thanks for always pushing the software to the limit!

Mike