Increasing the "maximum number of records returned by the server"

01-05-2014 11:06 AM
JamalNUMAN
Legendary Contributor

I am wondering what consequences we might face if we increase the "maximum number of records returned by the server" from the default of 1,000 to 240,000.

For example, in my case I have a layer that contains about 240,000 records. I need to set the "maximum number of records returned by the server" to 240,000 to make sure that all the records in the layer's attribute table can be returned when a particular query is applied.

[Attachments 30262 and 30263]
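For reference, the cap being discussed is the maxRecordCount value that a map service reports in its REST description, so it can be checked programmatically. A minimal sketch, assuming a hypothetical service URL:

import json
import urllib.request

# Hypothetical map service URL; substitute your own.
SERVICE_URL = "https://myserver/arcgis/rest/services/Cadastre/Parcels/MapServer"

with urllib.request.urlopen(SERVICE_URL + "?f=json") as resp:
    info = json.load(resp)

# The service description includes the record-return cap.
print("maxRecordCount:", info.get("maxRecordCount"))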


Thank you

Best

Jamal
----------------------------------------
Jamal Numan
Geomolg Geoportal for Spatial Information
Ramallah, West Bank, Palestine
1 Solution

Accepted Solutions
AnthonyGiles
Honored Contributor
Jamal,

You will find the response time from the server really slow, and your application will struggle to handle the results and will probably crash.

Regards

Anthony

20 Replies
by Anonymous User
Not applicable
Original User: nidhinkn

It will slow down both the client applications consuming your map service (such as web browsers) and your GIS server.
LeoDonahue
Deactivated User
Jamal,

If that query needs to return all the records, why not just maximize the efficiency of viewing that layer in your map in its entirety? Is it a point layer or a polygon layer?
LeoDonahue
Deactivated User
For example, have you considered using a view to represent that query, rather than let the tool build the query?

http://resources.arcgis.com/en/help/main/10.2/002q/002q00000023000000.htm
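A rough sketch of what that could look like using arcpy's SQL execution against an enterprise geodatabase; the connection file, table, and column names here are hypothetical, and the exact SQL (plus registering the view with the geodatabase) depends on your schema and DBMS:

import arcpy

# Hypothetical connection file to the enterprise geodatabase that holds the parcels table.
sde_connection = r"C:\connections\gisdb.sde"
conn = arcpy.ArcSDESQLExecute(sde_connection)

# Define the query once as a view instead of letting the client build it each time.
conn.execute("""
    CREATE VIEW parcels_residential AS
    SELECT objectid, parcel_id, shape
    FROM parcels
    WHERE land_use = 'RESIDENTIAL'
""")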
by Anonymous User
Not applicable
Original User: Jamal432@gmail.com

> For example, have you considered using a view to represent that query, rather than let the tool build the query?
>
> http://resources.arcgis.com/en/help/main/10.2/002q/002q00000023000000.htm



Many thanks, guys, for the help. Your answers are very useful.

Sometimes, for a particular query, more than 1,000 records are returned, and the end user needs to export them as a CSV file (from the web mapping application).

My current layer is a parcel (polygon) layer that contains 240,000 records.

If performance suffers when more than 1,000 records are returned, I would prefer to keep the setting as it is.
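One possible way to meet that export requirement without raising the cap, sketched here with a hypothetical layer URL and the standard REST query parameters: fetch the matching OBJECTIDs first (returnIdsOnly is generally not limited by maxRecordCount) and then pull the attributes in batches that stay under the cap.

import csv
import json
import urllib.parse
import urllib.request

# Hypothetical layer URL; substitute the parcel layer's REST endpoint.
LAYER_URL = "https://myserver/arcgis/rest/services/Cadastre/Parcels/MapServer/0"
WHERE = "1=1"   # the end user's query
BATCH = 1000    # stay at or below the service's maxRecordCount

def query(params):
    data = urllib.parse.urlencode(params).encode()
    with urllib.request.urlopen(LAYER_URL + "/query", data) as resp:
        return json.load(resp)

# 1) Collect every matching OBJECTID.
ids = query({"where": WHERE, "returnIdsOnly": "true", "f": "json"})["objectIds"]

# 2) Pull attributes in maxRecordCount-sized batches and stream them into a CSV file.
with open("parcels_export.csv", "w", newline="") as f:
    writer = None
    for start in range(0, len(ids), BATCH):
        chunk = ids[start:start + BATCH]
        result = query({
            "objectIds": ",".join(str(i) for i in chunk),
            "outFields": "*",
            "returnGeometry": "false",
            "f": "json",
        })
        for feature in result["features"]:
            attrs = feature["attributes"]
            if writer is None:
                writer = csv.DictWriter(f, fieldnames=list(attrs))
                writer.writeheader()
            writer.writerow(attrs)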
LeoDonahue
Deactivated User
Jamal,

There are other ways to serve this layer efficiently.  I would explore them before you listen to advice that says your site will "probably" crash, with really no evidence given.

At 10.2, if you are licensed, you can publish that parcel layer as its own service, using its own site, a group of ArcGIS Server machines (I think that is what they are calling them now).

*edit: Clusters... is the word I was looking for.
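A rough sketch of that idea against the ArcGIS Server Admin REST API as it existed around 10.1/10.2, when clusters were current. The host, credentials, service name, and cluster name are all placeholders, and the endpoints should be verified against the Admin API documentation for your version:

import json
import urllib.parse
import urllib.request

ADMIN = "https://myserver:6443/arcgis/admin"   # hypothetical admin URL
SERVICE = "Cadastre/Parcels.MapServer"         # hypothetical service

def post(url, params):
    data = urllib.parse.urlencode(params).encode()
    with urllib.request.urlopen(url, data) as resp:
        return json.load(resp)

# 1) Authenticate against the Admin API.
token = post(ADMIN + "/generateToken",
             {"username": "siteadmin", "password": "***",
              "client": "requestip", "f": "json"})["token"]

# 2) Read the current service configuration and point it at a dedicated cluster.
svc = post(ADMIN + "/services/" + SERVICE, {"f": "json", "token": token})
svc["clusterName"] = "ParcelsCluster"          # the cluster must already exist

# 3) Push the edited configuration back.
post(ADMIN + "/services/" + SERVICE + "/edit",
     {"service": json.dumps(svc), "f": "json", "token": token})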
by Anonymous User
Not applicable
Original User: ad_giles@hotmail.com

The problem is not with the server being able to handle publishing services with a large amount of data; it is with the browser being able to handle a large number of features. I would pretty much guarantee that trying to display 240,000 features in the attribute table would cause the browser to stop responding, that is, if the server can actually manage to return that number of features.

I have not tried to up the maximum features to that sort of figure, as doing so would definitely cause issues.

Regards

Anthony
by Anonymous User
Not applicable
Original User: mboeringa2010

> Many thanks, guys, for the help. Your answers are very useful.
>
> Sometimes, for a particular query, more than 1,000 records are returned, and the end user needs to export them as a CSV file (from the web mapping application).
>
> My current layer is a parcel (polygon) layer that contains 240,000 records.
>
> If performance suffers when more than 1,000 records are returned, I would prefer to keep the setting as it is.


Jamal, while it is probably unwise to try to allow access to all 240,000 records directly through a web service, there is a huge difference between the current setting of 1,000 and the 240,000 records of your layer. You might consider increasing it experimentally and incrementally, in steps of a thousand records, to see what is still acceptable. Maybe most of your users' requirements can be met by increasing it to, say, 5,000 or 10,000 records, and you might discover that this still works. But only testing can tell (or reviewing the detailed information and using the tools available on the ESRI System Design Strategies web pages, as the benchmarks for certain configurations listed there can be compared to your situation).

Of course, if current performance at 1,000 records is already doubtful, then increasing it won't help you unless you take additional action to boost your site's performance, network, or server hardware.
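A small way to measure each step, sketched here with a hypothetical layer URL: after raising maxRecordCount by another increment, time an open query and note how large the response gets.

import json
import time
import urllib.parse
import urllib.request

# Hypothetical layer URL; substitute the parcel layer's REST endpoint.
LAYER_URL = "https://myserver/arcgis/rest/services/Cadastre/Parcels/MapServer/0"

params = urllib.parse.urlencode({
    "where": "1=1",            # open query; the server returns up to maxRecordCount features
    "outFields": "*",
    "returnGeometry": "false",
    "f": "json",
}).encode()

start = time.time()
with urllib.request.urlopen(LAYER_URL + "/query", params) as resp:
    payload = resp.read()
elapsed = time.time() - start

count = len(json.loads(payload).get("features", []))
print("returned %d features, %.1f KB in %.2f s" % (count, len(payload) / 1024.0, elapsed))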
by Anonymous User
Not applicable
Original User: ldonahue

> The problem is not with the server being able to handle publishing services with a large amount of data; it is with the browser being able to handle a large number of features.

The browser?  I disagree.


> I have not tried to up the maximum features to that sort of figure, as doing so would definitely cause issues.

You should check before you make that statement.

On a poorly configured map service, running 10.0, and using the SQL hack, I can get 240,000 parcel attributes in about 35 seconds.  No crashing.

[Attachment 30299]