I have a Java app that has been released/deployed on PCs and laptops.
Recently, our customers have been looking at getting additional devices - more specifically, more portable devices which are lighter and smaller. I need to advise them on the minimum requirements these laptops must meet.
We grabbed a spare low-end laptop running on integrated graphics (if I remember correctly it's an Intel UHD 630). It's noticeably slower, and the map view would start to turn blank (either black or white) after prolonged use.
The map view includes offline map data (GeoTiffs, used as base map) and graphic overlays (mainly point geometries with marker symbols). The test on the low-end laptop was done without placing any graphics though.
With size as a constraint, the remaining options generally come with integrated graphics. I'm seeing Intel 11th generation CPUs shipping with Intel Iris Xe graphics. Based on benchmark scores, they seem to be on par with low-end discrete graphics cards (like the GeForce MX350). Is that adequate? I'm certainly worried because the Iris Xe GPU, being integrated, would not have any dedicated memory.
Could anyone advise on this?
Specifying a minimum spec for an SDK isn't easy. It really depends on the data you are using and how you are using it.
Theoretically, if your laptop supports DirectX 11 (assuming here you are deploying to Windows platforms), it should work. A simple map application which has a basemap and a few dots on it will work on some pretty low-specification machines.
If, however, you write an app which does intensive 3D visualisation, or even a 2D app which displays thousands of points or polygons all updating their location 50 times a second, you are going to struggle with a low-end GPU.
If resources are an issue with the device you are deploying to, there are things you can do in code to make better use of the hardware. You mention you are using graphics overlays, so a starting point could be to play around with the rendering mode for your graphics overlays (STATIC mode may help).
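For example, in the ArcGIS Runtime SDK for Java (100.x API) the rendering mode is chosen when the overlay is constructed. This is just a sketch - `mapView` is assumed to be your existing `MapView` instance:

```java
import com.esri.arcgisruntime.mapping.view.GraphicsOverlay;

// Create the overlay in STATIC rendering mode rather than the default
// DYNAMIC mode. STATIC trades smooth per-frame animation for a lower
// GPU load, which suits graphics that rarely move.
GraphicsOverlay overlay =
    new GraphicsOverlay(GraphicsOverlay.RenderingMode.STATIC);
mapView.getGraphicsOverlays().add(overlay);
```

DYNAMIC mode is optimised for graphics that move or change every frame; if your markers are mostly stationary, STATIC should noticeably reduce the work the GPU has to do.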
You also mention marker symbols. If you are displaying lots of graphics, using a Renderer for your symbols (a Unique Value Renderer, for example) will save you some texture memory on your graphics card if you are currently creating a symbol for each graphic.
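A minimal sketch of that idea, again assuming the ArcGIS Runtime SDK for Java 100.x API and an existing `GraphicsOverlay` named `overlay` (with a single symbol class, a SimpleRenderer is enough; a UniqueValueRenderer works the same way keyed on attribute values):

```java
import com.esri.arcgisruntime.geometry.Point;
import com.esri.arcgisruntime.geometry.SpatialReferences;
import com.esri.arcgisruntime.mapping.view.Graphic;
import com.esri.arcgisruntime.symbology.SimpleMarkerSymbol;
import com.esri.arcgisruntime.symbology.SimpleRenderer;

// One marker symbol shared by every graphic via the overlay's renderer,
// instead of constructing a new SimpleMarkerSymbol per Graphic.
SimpleMarkerSymbol marker = new SimpleMarkerSymbol(
    SimpleMarkerSymbol.Style.CIRCLE, 0xFFFF0000, 10); // red, 10pt

overlay.setRenderer(new SimpleRenderer(marker));

// Graphics created without a symbol pick it up from the renderer,
// so the symbol's texture is uploaded to the GPU only once.
overlay.getGraphics().add(new Graphic(
    new Point(-0.12, 51.5, SpatialReferences.getWgs84())));
```

The point is that the renderer lets the SDK reuse one texture for all matching graphics, rather than holding a texture per symbol instance.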
Sometimes writing an app for a low end device requires careful coding to make efficient use of limited resources.