OpenData download problems

12-02-2015 01:02 PM
deleted-user-Bkz8cHvtt5bs
New Contributor III

There have been quite a few issues lately with downloads of data.  I even tried to change most of the datasets that are not versioned to use a manual cache so that I can ensure one good download on ESRI's servers.  Unfortunately, I've been receiving a lot of complaints from my users, particularly about this dataset:

http://geodata.myfwc.com/datasets/68fbc06232164ce6b6b0d046e906b885_0

Can this be fixed on ESRI's end?

DanielFenton1
Occasional Contributor III

Hi Chris, we need to figure out why this query is failing or slow:

http://atoll.floridamarine.org/arcgis/rest/services/FWC_GIS/OpenData_Elevation/MapServer/0/query?outSR=4326&where=OBJECTID%20%3E=%2014001%20AND%20OBJECTID%3C=15000&f=json&outFields=*&geometry=&returnGeometry=true&geometryPrecision=10

Open Data cannot produce a download from this dataset if this query fails repeatedly.
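If you want to reproduce this outside of Open Data, here is a rough Python sketch (assuming the requests library is installed; the 500-record window size is just an illustration) that walks that same OBJECTID range in smaller pages, to see whether the whole page fails or only part of it:

import requests

# The layer Open Data is paging through (from the failing URL above).
LAYER = ("http://atoll.floridamarine.org/arcgis/rest/services/"
         "FWC_GIS/OpenData_Elevation/MapServer/0/query")

def fetch_page(first_oid, last_oid, timeout=90):
    """Request one OBJECTID window, mirroring the Open Data query."""
    params = {
        "where": "OBJECTID >= {0} AND OBJECTID <= {1}".format(first_oid, last_oid),
        "outFields": "*",
        "returnGeometry": "true",
        "geometryPrecision": "10",
        "outSR": "4326",
        "f": "json",
    }
    resp = requests.get(LAYER, params=params, timeout=timeout)
    resp.raise_for_status()
    return resp.json()

# Walk the failing range (14001-15000) in smaller windows.
for start in range(14001, 15001, 500):
    end = min(start + 499, 15000)
    try:
        page = fetch_page(start, end)
        print(start, end, "ok:", len(page.get("features", [])), "features")
    except Exception as exc:
        print(start, end, "FAILED:", exc)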

My best guess is that the heap size on your server is set too low. That means this query is simply too big to hold in memory.

You could try lowering the max record count for this service or increasing the allowed heap size. Does either of those help that query complete?
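If it helps to script that change, the max record count can also be set through the ArcGIS Server admin API instead of Manager. A rough sketch only, assuming the standard /arcgis/admin endpoints on port 6080 and placeholder admin credentials (the folder and service names just mirror the URL above):

import json
import requests

ADMIN = "http://atoll.floridamarine.org:6080/arcgis/admin"  # assumed admin URL

# Get an admin token from the standard generateToken endpoint.
token = requests.post(ADMIN + "/generateToken", data={
    "username": "admin_user",      # placeholder credentials
    "password": "admin_password",
    "client": "requestip",
    "f": "json",
}).json()["token"]

svc = ADMIN + "/services/FWC_GIS/OpenData_Elevation.MapServer"

# Read the current service definition, lower maxRecordCount, and push it back.
definition = requests.get(svc, params={"token": token, "f": "json"}).json()
definition["properties"]["maxRecordCount"] = "2000"
result = requests.post(svc + "/edit", data={
    "service": json.dumps(definition),
    "token": token,
    "f": "json",
}).json()
print(result)  # expect {"status": "success"}; note the service restarts on edit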

Daniel

deleted-user-Bkz8cHvtt5bs
New Contributor III

Hi Daniel,

We are going to go through our OpenData services and reduce the max record counts for all of them to 1000 or 2000 records, but I wanted to verify that the big data download issue (i.e. the shapefiles only downloading approximately 1000 records) was resolved with the bootstrap update about 9 months ago.  The original reason we upped the max record count was to work around the truncated download issue.

I’d also like to say that I’m unfamiliar with the concept of the “heap size” setting on the server.  I’ve researched it a little and found how to change it, but the ESRI help documentation simply stated that it should be changed to “an appropriate heap size.”  Do you know of any documentation for determining what an “appropriate” heap size would be?  For example, if our server is currently at the defaults of 64MB for the SOC heap size and 256MB for the server heap size, and we have about 40 services, should we quadruple the heap sizes, or would doubling them suffice?  Also, would it be appropriate to increase only one of the heap sizes?  Since you’re suggesting we increase the server heap size, should we increase only that, or should we also increase the SOC heap size proportionally?

DanielFenton1
Occasional Contributor III

Hey Chris,

Lowering your max record counts to 1k to 2k does sound appropriate in general. As you have noted, the process of creating downloads has changed significantly since the early days of Open Data.

Unfortunately the documentation is a little weak around recommended heap sizes. How much total RAM is available on your server instance(s)? On machines with large amounts of RAM you should be able to increase the Server heap size quite a bit. Without seeing the actual error reported by your server instance, it’s tough for me to tell you whether you need to increase Server, SOC, or both.

The Server heap is the total amount of memory available to all services, while the SOC heap represents the amount available to each individual service.

All that said, reducing the max record counts to more appropriate levels will reduce the need for ArcGIS Server to use heap space when serving requests.

Daniel

deleted-user-Bkz8cHvtt5bs
New Contributor III

Hi Daniel,

Thanks for this information.  Currently we have 37 services (I think) on one of our servers named atoll (atoll.floridamarine.org).  That server has 64GB of RAM available, of which we are currently only using 13%.  Our current heap size settings are the default 64MB for SOC and 256MB for the server.

DanielFenton1
Occasional Contributor III

Wow, 64GB is a ton of RAM. You can safely increase your heap sizes quite a bit. I’m hesitant to give you specific numbers because I’m not an expert in ArcGIS Server, but I can certainly say the defaults were created for a machine with much less than 64GB of RAM. Support may be able to point you to a better guideline.
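For a rough sense of scale, here is a back-of-envelope sketch (my assumption: one SOC process per service, each able to hit its maximum heap at the same time, which is the worst case):

# Back-of-envelope heap budget; the quadrupled values are only an example.
services = 40
soc_heap_mb = 256        # 4x the 64MB default, per service
server_heap_mb = 1024    # 4x the 256MB default

worst_case_gb = (services * soc_heap_mb + server_heap_mb) / 1024.0
print("worst-case heap footprint: about %.1f GB of 64 GB" % worst_case_gb)
# -> worst-case heap footprint: about 11.0 GB of 64 GB

Even that pessimistic case leaves most of your 64GB free.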

Again, it may not be necessary at all if you lower the max record count to more appropriate values.

deleted-user-Bkz8cHvtt5bs
New Contributor III

Thanks Daniel,

I also see a number of my datasets are coming up with errors stating that the “Record count query failed exceeded timeout of 90000ms.”  Would this also be resolved by increasing the heap size, or is this something else in the service that needs to be optimized?
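In case it's useful, the count query behind that error can be reproduced and timed directly. A minimal sketch (returnCountOnly is a standard query parameter; the layer URL here is just one example of an affected service):

import time
import requests

LAYER = ("http://atoll.floridamarine.org/arcgis/rest/services/"
         "FWC_GIS/OpenData_Elevation/MapServer/0/query")

# Open Data first asks the layer how many records match; if this call
# takes longer than 90 seconds, you get the timeout error quoted above.
start = time.time()
resp = requests.get(LAYER, params={
    "where": "1=1",
    "returnCountOnly": "true",
    "f": "json",
}, timeout=120)
print("count:", resp.json().get("count"),
      "in %.1f seconds" % (time.time() - start))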

DanielFenton1
Occasional Contributor III

It’s possible that those are related, but I really can’t say. I’d talk to Support and see if they can link you up with someone who specializes in Server Performance.

DanielFenton1
Occasional Contributor III

Out of curiosity, are you willing to post the service definition?

deleted-user-Bkz8cHvtt5bs
New Contributor III

Attached is the service definition file for the service that is throwing the “Record count query failed [because it] exceeded timeout of 90000ms” error.  Please let me know if you see any glaring problems.

Thanks,

Chris
