
layer.setVisibleLayers() behaves differently in 10 than in 9.3

07-20-2011 05:17 AM
AmandaMyer
Deactivated User
Hello,

We have run across what we feel is a bug. In ArcGIS Server 9.3, if you tried the following:

layer.setVisibleLayers([]);

it would hide all the layers appropriately.

When you do the same thing in ArcGIS Server 10, it does not remove the last layer.

In our scenario, a user may toggle layer visibility by clicking items in a checkbox list of layers. When they toggle visibility off for every layer, the last layer always remains drawn unless they toggle the visibility of its parent layer as well.
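A minimal sketch of such a handler, for context (assuming layer is an esri.layers.ArcGISDynamicMapServiceLayer and each checkbox's value holds a sublayer id; the .layer-checkbox class and the function name are just illustrative):

// Collect the ids of the checked boxes and apply them to the layer.
function updateVisibleLayers() {
  var visible = [];
  dojo.query(".layer-checkbox").forEach(function (box) {
    if (box.checked) {
      visible.push(parseInt(box.value, 10));
    }
  });
  // At 10, passing an empty array here leaves the last sublayer drawn.
  layer.setVisibleLayers(visible);
}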

You can see this behavior in the working example on Esri's website, too.

Is this working as intended?  Or is this a bug?  If it is working as intended, is there a workaround for this behavior?
1 Reply
GarimaVyas
Emerging Contributor
To set no visible layers, pass an array containing -1 into setVisibleLayers. This works for both 9.x and 10 services.
// If nothing is checked, pass -1 so that no sublayers are drawn
if (visible.length === 0) {
  visible.push(-1);
}
layer.setVisibleLayers(visible);
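With that guard in place, the handler can call setVisibleLayers unconditionally: an array containing just -1 tells the service to draw no sublayers at all, which sidesteps the empty-array behavior at 10.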