I have used code like this to create a TileInfo for my custom tiled layer:
import java.util.ArrayList;

import com.esri.arcgisruntime.arcgisservices.LevelOfDetail;
import com.esri.arcgisruntime.arcgisservices.TileInfo;
import com.esri.arcgisruntime.geometry.Point;
import com.esri.arcgisruntime.geometry.SpatialReferences;

// Web Mercator ground resolution at zoom 0 for 256-px tiles (~2 * pi * 6378137 / 256)
private static final double INITIALRESOLUTION = 156543.03392800014;

public static double resolution(int zoom) {
    return INITIALRESOLUTION / Math.pow(2, zoom);
}

public static TileInfo CreateTileInfo(int minZoom, int maxZoom) {
    int dpi = 96;
    ArrayList<LevelOfDetail> levels = new ArrayList<>();
    for (int i = minZoom; i <= maxZoom; i++) {
        double resolution = resolution(i);
        // scale = meters per pixel * pixels per inch * inches per meter
        double scale = resolution * dpi * 39.37;
        levels.add(new LevelOfDetail(i, resolution, scale));
    }
    return new TileInfo(dpi, TileInfo.ImageFormat.PNG, levels,
            new Point(-20037508.342789244, 20037508.342789244, SpatialReferences.getWebMercator()),
            SpatialReferences.getWebMercator(), 256, 256);
}
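(For context, I feed that TileInfo into a WebTiledLayer roughly like this; the URL template, sub-domains, and zoom range below are placeholders, not my real server.)

import java.util.Arrays;
import com.esri.arcgisruntime.geometry.Envelope;
import com.esri.arcgisruntime.layers.WebTiledLayer;

Envelope fullExtent = new Envelope(
        -20037508.342789244, -20037508.342789244,
         20037508.342789244,  20037508.342789244,
        SpatialReferences.getWebMercator());

WebTiledLayer layer = new WebTiledLayer(
        "https://{subDomain}.tiles.example.com/{level}/{col}/{row}.png",  // placeholder URL
        Arrays.asList("a", "b"),   // placeholder sub-domains
        CreateTileInfo(0, 19),     // placeholder zoom range
        fullExtent);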
Why did I choose 96 as the DPI? I don't know, except that every ArcGIS Server instance I have come across publishes 96 as its DPI.
Example:
World_Topo_Map (MapServer)
Tile Info:
- Height: 256
- Width: 256
- DPI: 96
- Levels of Detail: 24
  - Level ID: 0 [ Start Tile, End Tile ]
    - Resolution: 156543.03392800014
    - Scale: 5.91657527591555E8
  - Level ID: 1 [ Start Tile, End Tile ]
    - Resolution: 78271.51696399994
    - Scale: 2.95828763795777E8
and so on.
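As a quick sanity check, the scale formula in my code reproduces those published numbers from the service's own level-0 resolution:

double res0 = 156543.03392800014;             // level 0 resolution from the service
System.out.println(res0 * 96 * 39.37);        // 5.9165752759...E8 -> matches level 0 scale
System.out.println(res0 / 2 * 96 * 39.37);    // 2.9582876379...E8 -> matches level 1 scale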
So far so good, but take a look at these screenshots that a user sent me.
[Two screenshots side by side, both with a zoom indicator reading "14".]
On the left is a custom map view developed in house; on the right, one using the ArcGIS Android SDK.
Both of them say 14, but the number means something different in each.
On the left, tiles are drawn at a one-to-one pixel ratio, so 14 simply means LOD 14 of the tile server.
On the right, 14 actually means a scale of 1:36111.909643.
But to compare the two, you have to take into account the DPI of 96 versus whatever the screen's DPI actually is.
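To make that concrete, here is the scale at which one tile pixel covers exactly one physical screen pixel; this is plain geometry, not anything specific to the Runtime:

// Scale at which one tile pixel covers exactly one physical screen pixel:
// scale = ground meters per pixel / screen meters per pixel
//       = resolution(lod) * screenDpi * 39.37   (39.37 inches per meter)
public static double oneToOneScale(int lod, double screenDpi) {
    return resolution(lod) * screenDpi * 39.37;
}

// oneToOneScale(14, 96)  ~= 1:36,112   (the server's published LOD 14 scale)
// oneToOneScale(14, 576) ~= 1:216,671  (LOD 14 tiles drawn 1:1 on this device)
// So a true physical scale of 1:36,112 on a 576-dpi screen is six times more
// zoomed in than the in-house view drawing LOD 14 tiles pixel for pixel.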
I theorized that if I set the DPI to exactly the device's DPI (in my case 576), they would look the same.
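The attempt was essentially just a variation of the method above with the DPI passed in, plus reading the device's DPI from Android:

// Same as CreateTileInfo above, but with the DPI passed in instead of hard-coded 96.
public static TileInfo CreateTileInfo(int minZoom, int maxZoom, int dpi) {
    ArrayList<LevelOfDetail> levels = new ArrayList<>();
    for (int i = minZoom; i <= maxZoom; i++) {
        double resolution = resolution(i);
        double scale = resolution * dpi * 39.37;  // scale now follows the claimed DPI
        levels.add(new LevelOfDetail(i, resolution, scale));
    }
    return new TileInfo(dpi, TileInfo.ImageFormat.PNG, levels,
            new Point(-20037508.342789244, 20037508.342789244, SpatialReferences.getWebMercator()),
            SpatialReferences.getWebMercator(), 256, 256);
}

// Inside an Activity: xdpi is the actual horizontal pixel density (~576 on this device).
int deviceDpi = Math.round(getResources().getDisplayMetrics().xdpi);
TileInfo info = CreateTileInfo(0, 19, deviceDpi);  // 0-19 is a placeholder zoom range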
I cannot make that work.
I do not think setting it to the screen DPI is the answer. But for some map sources, 96 may not be the optimal number either, right?