Re-generate esri_accumulated_stats.json for deep learning training

09-06-2020 04:47 PM
TimGrenside
New Contributor III

Hello

Is there a way to re-generate "esri_accumulated_stats.json" after running "Export Training Data For Deep Learning"? I often export many areas of interest into separate, distinct directories and then want to combine all the images and labels into one directory to train on. In that case it would be great to have something re-generate the JSON file based on the contents of the consolidated directory.

It would also be nice if the maps.txt and stats.txt files were updated.

Many thanks

1 Solution

Accepted Solutions
GuneetMutreja
Esri Contributor

Hey,

If you pass the same output directory every time you export a different area of interest, that directory will accumulate the image chips for all areas, and the related files (maps.txt, stats.txt and esri_accumulated_stats.json) will be updated automatically. You do not need to pass a separate directory for each AOI if the data is the same; just use a single one.
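For example, repeated exports into one folder could be scripted roughly like this (a sketch only: arcpy is available inside the ArcGIS Pro Python environment, the raster/label/mask paths are hypothetical placeholders, and the parameter names are from memory of the tool's signature, so verify them against the tool reference for your version):

```python
# Sketch: export several AOIs into ONE output folder so that
# maps.txt, stats.txt and esri_accumulated_stats.json accumulate.
# arcpy ships with ArcGIS Pro; the import is guarded so the sketch
# can be read outside that environment.
try:
    import arcpy  # ArcGIS Pro site package
except ImportError:
    arcpy = None

# Hypothetical inputs -- substitute your own imagery, label layer
# and AOI mask polygons.
in_raster = "imagery.tif"
in_class_data = "labels.shp"
out_folder = r"C:\data\chips_all_aois"  # the SAME folder every run
aoi_masks = ["aoi_north.shp", "aoi_south.shp", "aoi_east.shp"]

if arcpy:
    for mask in aoi_masks:
        # Each call appends chips for one AOI; the accumulated
        # stats files in out_folder are updated on every run.
        arcpy.ia.ExportTrainingDataForDeepLearning(
            in_raster, out_folder, in_class_data,
            image_chip_format="TIFF",
            tile_size_x=256, tile_size_y=256,
            metadata_format="Classified_Tiles",
            class_value_field="classvalue",  # must match across runs
            in_mask_polygons=mask,
        )
```

Keep the class value field (and the set of classes it yields) consistent between runs; otherwise the tool refuses to append to the existing directory.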


2 Replies
GuneetMutreja
Esri Contributor

(Accepted solution, quoted above.)

TimGrenside
New Contributor III

Thanks Guneet. Yes, you are right, thanks for the help. I had tried this before but kept getting stopped by the error below (a bug?).

ERROR 002860: Tool parameters are inconsistent with the data you are trying to append to.
Failed to execute (ExportTrainingDataForDeepLearning).

To resolve this, the "Class Value Field" must contain exactly the same number of classes as the previously run exports in the output directory. That limitation seems odd to me, but thanks again.
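For anyone who still needs to merge directories that were already exported separately: the schema of esri_accumulated_stats.json isn't documented here, but assuming it is plain JSON whose numeric count fields simply add up across exports, a generic merge could look like the sketch below. The field names used in the comments are hypothetical; inspect one of your own files first to confirm which leaves are actually additive counts.

```python
import json
from pathlib import Path


def merge_stats(a, b):
    """Recursively merge two stats structures: numeric leaves are
    summed, nested dicts are merged key by key, and any other value
    must agree between the two files.
    ASSUMPTION: counts in esri_accumulated_stats.json are additive
    across exports -- verify against your own data."""
    if isinstance(a, dict) and isinstance(b, dict):
        out = dict(a)
        for key, val in b.items():
            out[key] = merge_stats(a[key], val) if key in a else val
        return out
    if (isinstance(a, (int, float)) and isinstance(b, (int, float))
            and not isinstance(a, bool) and not isinstance(b, bool)):
        return a + b
    if a != b:
        raise ValueError(f"conflicting non-numeric values: {a!r} vs {b!r}")
    return a


def merge_directories(dirs, out_dir):
    """Combine esri_accumulated_stats.json from several export
    directories into one file in out_dir (hypothetical helper)."""
    merged = {}
    for d in dirs:
        stats = json.loads(
            (Path(d) / "esri_accumulated_stats.json").read_text())
        merged = merge_stats(merged, stats) if merged else stats
    out_path = Path(out_dir) / "esri_accumulated_stats.json"
    out_path.write_text(json.dumps(merged, indent=2))
    return merged
```

This only rebuilds the JSON file; maps.txt would still need to be concatenated (and chip paths adjusted) by hand after copying the images and labels into the consolidated directory.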