Hello,
I want to process 231 polygons covering about 20,000 ha (mean 85 ha) with the Detect Objects Using Deep Learning tool. At the moment I process every polygon as a separate partial data set, and it takes a long time on my notebook (more than 2 days). Are there better ways to process the data? Bigger partial data sets? All in one step?
@JohannesBierer here are some tips:
CPU vs. GPU: If you aren't using a dedicated NVIDIA GPU with CUDA support, the tool falls back to the CPU. Deep learning inference on a CPU is roughly 5–50x slower than on a compatible GPU, so this alone can explain a multi-day run.
Tile/Batch Size: you can increase the Batch size parameter so more image tiles are processed per inference step. Larger batches use more GPU memory, so raise it until you approach your GPU's memory limit.
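On the "bigger partial data sets" question: rather than 231 separate runs, you could group the polygons into fewer, larger processing extents and run the tool once per group, which avoids the per-run startup and model-loading overhead. Here is a minimal sketch of the grouping step in plain Python; the `polygons` list of (name, area_ha) pairs and the 2,000 ha cap are made-up values for illustration, and the actual detection call per group would still be the Detect Objects Using Deep Learning tool with that group's combined extent:

```python
def group_polygons(polygons, max_group_ha=2000.0):
    """Greedily pack (name, area_ha) polygons into groups whose total
    area stays under max_group_ha, so each group becomes one tool run."""
    groups, current, current_area = [], [], 0.0
    for name, area in sorted(polygons, key=lambda p: p[1], reverse=True):
        # Flush the current group when the next polygon would overflow it
        if current and current_area + area > max_group_ha:
            groups.append(current)
            current, current_area = [], 0.0
        current.append(name)
        current_area += area
    if current:
        groups.append(current)
    return groups

# With 231 polygons averaging 85 ha, an (assumed) 2,000 ha cap collapses
# 231 separate runs into 11 larger ones.
polys = [(f"poly_{i}", 85.0) for i in range(231)]
batches = group_polygons(polys)
print(len(batches))  # → 11
```

The cap is something you would tune: large enough to cut down the number of runs, small enough that one run still fits in memory and a failure doesn't cost you days of reprocessing.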