Hi,
Glad to hear you're trying out the WIM!
Which tools are returning the generic 99999 error? I have seen that happen before during Train Random Trees and have traced it back to the Composite Bands tool that runs on the backend. You can try opening the Composite Bands tool, navigating to Environments, and changing the Parallel Processing Factor to 0. Then, retry Train Random Trees.
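If you would rather script it than click through the dialog, the same environment can be set from the Python window before rerunning the tool. A minimal sketch (assuming you are running the tools from a script or the ArcGIS Python window):

```python
import arcpy

# "0" disables parallel processing for any geoprocessing tool run afterward,
# including the backend Composite Bands call made during Train Random Trees.
arcpy.env.parallelProcessingFactor = "0"
```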
All of the smoothing methods use the Raster to NumPy Array tool before performing operations, so the input raster size constraints are the same as those listed for that core software tool, and they will vary depending on the RAM available. My machine has 64 GB of RAM and I received the same memory error when clipping my DEM to the extent of a HUC12 watershed (1 m resolution, 20,694 columns by 17,258 rows). Ideally, the tool should be able to process data at the HUC12 scale, so I will look into releasing improvements to WIM to make that possible. In the meantime, I would recommend clipping to sizes closer to a HUC16 and working in chunks. I would also recommend using WIM's smoothing tool rather than Focal Statistics to smooth the DEM, specifically so that you can apply the Perona-Malik method, which has been shown to significantly improve wetland predictions and outperform other standard smoothing methods. Those results are presented in this paper: https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2019WR024784
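To make the memory constraint concrete: this is not the WIM implementation itself, just a minimal numpy sketch of Perona-Malik diffusion applied to an array pulled in with RasterToNumPyArray, which shows why the whole clipped DEM has to fit in RAM at once. The file paths and the kappa/gamma/iteration values are placeholders.

```python
import arcpy
import numpy as np

def perona_malik(arr, n_iter=10, kappa=50.0, gamma=0.1):
    """Basic Perona-Malik anisotropic diffusion on a 2D array.
    kappa controls edge sensitivity, gamma the step size."""
    u = arr.astype(np.float64)
    for _ in range(n_iter):
        # Differences to the four neighbors (np.roll wraps at the edges,
        # which is fine for a sketch; a real implementation would pad).
        dn = np.roll(u, 1, axis=0) - u
        ds = np.roll(u, -1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping weights: diffusion slows across strong gradients,
        # so breaklines are preserved while small-scale noise is smoothed.
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

# The entire clipped DEM is loaded into RAM as a float array here, which is
# where the memory error comes from on very large extents.
dem = arcpy.Raster("dem_clipped.tif")          # placeholder path
arr = arcpy.RasterToNumPyArray(dem)
smoothed = perona_malik(arr)
out = arcpy.NumPyArrayToRaster(smoothed, dem.extent.lowerLeft,
                               dem.meanCellWidth, dem.meanCellHeight)
out.save("dem_pm_smoothed.tif")                # placeholder path
```

Working in chunks just means repeating this over smaller clipped extents and mosaicking the outputs afterward.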
Also note that when you get to the Train Random Trees and Assess Accuracy steps, all rasters should have the same extent as your ground truth raster (i.e., use that raster as the Snap Raster, Extent, and Mask environment settings), since training and accuracy assessment can only be done using cells where the ground truth data is known. Setting these constraints automatically for the user is a fix coming to a new version of WIM by the end of this week.
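Until that fix ships, the three environments can be pinned by hand before running those steps. A minimal sketch from the Python window, where ground_truth.tif is a placeholder for your own ground truth raster:

```python
import arcpy

ground_truth = "ground_truth.tif"            # placeholder path
arcpy.env.snapRaster = ground_truth          # align cell boundaries
arcpy.env.extent = ground_truth              # limit processing to the labeled area
arcpy.env.mask = ground_truth                # drop cells with no ground truth value
```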
Hope that helps as a temporary fix; I will be working on a better one!